A History of the Computer

PC users in the Nashua, NH area are likely to take the modern home and business computer for granted, perhaps only thinking about how a computer actually works when they find themselves in need of computer repair or service.
The contemporary computer is part of a vast legacy that reaches back thousands of years. While we don't often think about the centuries of work that have gone into our laptops, that history is worthwhile knowledge for anyone in the technology industry.
The Oxford English Dictionary defines a computer as "a device or machine for performing or facilitating calculation." At first, computers were people whose job was simply to calculate all day. Human error was the obvious flaw in that arrangement, and people soon sought mechanical solutions for calculation. Various forms of primitive calculators arose, but it took centuries for efficient mechanical calculation to get off the ground.
Around the fourth century B.C., the abacus was invented as a way to quickly perform simple calculations. This small device paved the way for mechanized mathematics, providing a foundation on which more advanced ideas could be built. Over the following centuries, inventors such as Wilhelm Schickard, Blaise Pascal, and Charles Babbage developed innovations in computing, most of them mechanical calculators. While these were the building blocks for today's computer, it was the twentieth century that produced the earliest versions of the computer as we know it.
In 1939, David Packard and Bill Hewlett started a technology company in a one-car garage, producing items like the audio oscillator. From there, the wheels were set in motion for a computerized generation. Machines like Konrad Zuse's Z3 and Howard Aiken's Mark I expanded on earlier ideas into far more complex systems. Both were essentially calculators, but they were game-changers in the computing world. Zuse's machine used binary arithmetic and Boolean logic, which remain integral elements of computers today. Harvard professor Aiken teamed up with IBM to create the colossal Mark I, which gave rise to crucial concepts such as debugging and compiling, both credited to Grace Hopper, one of the programmers behind the Mark I.
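To make Zuse's two ingredients, binary numbers and Boolean logic, concrete, here is a minimal sketch in modern Python (an illustration only, bearing no resemblance to the Z3's actual workings):

    # The number 13 in binary is 1101: one 8, one 4, no 2, one 1.
    n = 13
    print(bin(n))    # 0b1101

    # Boolean logic combines true/false values with AND, OR, and NOT.
    a, b = True, False
    print(a and b)   # False: both inputs must be true
    print(a or b)    # True: at least one input is true
    print(not a)     # False: inversion

    # Modern processors are built from circuits that perform
    # exactly these kinds of numeric and logical operations.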
The ENIAC was a particularly important step toward the present-day computer. Short for Electronic Numerical Integrator and Computer, it is widely considered one of the predecessors of the modern digital computer. It was built for the Army's Ballistic Research Laboratory to calculate artillery firing tables but had a far more wide-reaching impact. ENIAC developers John Mauchly and J. Presper Eckert went on to team up with mathematician John von Neumann to design the EDVAC. This second machine introduced the stored-program concept to the world, in which instructions are held in the same memory as data, and made programming far more practical.
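The stored-program idea is easier to appreciate with a toy example. The Python sketch below is purely illustrative (a made-up three-instruction machine, not the EDVAC's real instruction set); the key point is that the program itself lives in memory as data:

    # A toy stored-program machine: the instructions themselves
    # sit in memory, just like any other data.
    memory = [
        ("LOAD", 7),      # set the accumulator to 7
        ("ADD", 5),       # add 5 to the accumulator
        ("PRINT", None),  # output the accumulator
    ]

    accumulator = 0
    for op, arg in memory:
        if op == "LOAD":
            accumulator = arg
        elif op == "ADD":
            accumulator += arg
        elif op == "PRINT":
            print(accumulator)  # prints 12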
One of the next big steps was the UNIVAC, one of the world's first commercial computers. Another Mauchly and Eckert invention, it was the first computer to use magnetic tape for storage. Another commercial machine came from IBM, whose 650 model became immensely popular. Over the next several decades, IBM would grow into one of the main computing giants, selling millions of units and developing some of the most influential devices in technological history.
Perhaps the most important changes in late-twentieth-century computing were changes in how people interacted with computers themselves. Until the early seventies, users relied largely on paper to communicate with a machine, taking one of two paths. The first was to use a teletype that interacted with the mainframe and printed out a response. The second involved punching programs onto cards that were then run through the computer, again ending with a printout. Both processes were laborious and inefficient, so engineers began devising more user-friendly ways to interact with a computer.
The answer came in 1971, when Intel introduced the microprocessor, a tiny chip that revolutionized computing. The first model, the 4004, was the beginning of Intel's legacy and was soon followed by others. Intel's 8080 powered the MITS Altair 8800, widely considered the first personal computer. Users assembled it at home from a kit, and it became an instant hit. Beyond being a milestone in its own right, the Altair also gave rise to a new technological goliath.
In the mid-seventies, Paul Allen and Bill Gates made a career out of writing software for the Altair, a partnership that led to the founding of Microsoft. Around the same time, Steve Wozniak and Steve Jobs released the Apple II, which was an immediate success and established Apple as a serious contender for technological dominance. By 1980, Apple had taken over half of the personal computer market.
Both Microsoft and Apple produced innovations that still have an impact today. Microsoft developed the MS-DOS operating system for IBM's PC, another smash hit in the personal computer world. Apple released the Macintosh in 1984; its combination of useful features and an attractive price helped it compete with IBM's machines for the market, and it was the first commercially successful computer to use a mouse. In 1985, Microsoft made its own huge leap forward with Windows 1.0.
Storage became an important issue in personal computing. Floppy disks were one of the first widely adopted storage media, used in such computers as Apple's failed Lisa and IBM's PC-AT. IBM standardized the 3.5-inch floppy with its PS/2 line, and floppies remained the main method of portable storage for years. Around the same time, Philips and Sony developed the CD-ROM, which held far more information and proved a more efficient storage medium. Over time, even more portable solutions were sought, and flash drives were invented in response.
Companies such as Hewlett-Packard, IBM, Microsoft, and Apple have stood the test of time and continue to produce remarkable technology. Microsoft's Windows and Apple's operating systems have captured the market and remain the two most widely used desktop platforms. The world-changing advances in how computers store information and operate have left a lasting mark, visible in how thoroughly technology now permeates daily life.
From the simple abacus to the intricate microprocessor, the road to the modern computer is indeed long and winding. The computer's history is worthwhile knowledge not only for industry experts but also for the everyday users who depend on the technology around them. The story of the computer is still being written, and one can only imagine the astounding developments yet to come.