Thursday, November 7, 2019
A Look at the History of Computers
Before the age of electronics, the closest thing to a computer was the abacus, although, strictly speaking, the abacus is actually a calculator since it requires a human operator. Computers, on the other hand, perform calculations automatically by following a series of built-in commands called software. In the 20th century, breakthroughs in technology allowed for the ever-evolving computing machines that we now depend upon so totally that we practically never give them a second thought. But even prior to the advent of microprocessors and supercomputers, there were certain notable scientists and inventors who helped lay the groundwork for the technology that has since drastically reshaped every facet of modern life.

The Language Before the Hardware

The universal language in which computers carry out processor instructions originated in the 17th century in the form of the binary numerical system. Developed by German philosopher and mathematician Gottfried Wilhelm Leibniz, the system came about as a way to represent decimal numbers using only two digits: zero and one. Leibniz's system was partly inspired by philosophical explanations in the classical Chinese text the "I Ching," which explained the universe in terms of dualities such as light and darkness and male and female. While there was no practical use for his newly codified system at the time, Leibniz believed it was possible for a machine to someday make use of these long strings of binary numbers.

In 1847, English mathematician George Boole introduced a newly devised algebraic language built on Leibniz's work. His "Boolean algebra" was actually a system of logic, with mathematical equations used to represent statements in logic. Equally important was that it employed a binary approach in which the relationship between different mathematical quantities would be either true or false, 0 or 1. As with Leibniz, there were no obvious applications for Boole's algebra at the time. However, mathematician Charles Sanders Peirce spent decades expanding the system, and in 1886 determined that the calculations could be carried out with electrical switching circuits. As a result, Boolean logic would eventually become instrumental in the design of electronic computers.

The Earliest Processors

English mathematician Charles Babbage is credited with having assembled the first mechanical computers, at least technically speaking. His early 19th-century machines featured a way to input numbers, memory, and a processor, along with a way to output the results. Babbage called his initial attempt to build the world's first computing machine the "difference engine." The design called for a machine that calculated values and printed the results automatically onto a table. It was to be hand-cranked and would have weighed four tons. But Babbage's baby was a costly endeavor: more than £17,000 was spent on the difference engine's early development, and the project was eventually scrapped after the British government cut off Babbage's funding in 1842.

This forced Babbage to move on to another idea, the analytical engine, which was more ambitious in scope than its predecessor and was to be used for general-purpose computing rather than just arithmetic. While he was never able to follow through and build a working device, Babbage's design featured essentially the same logical structure as the electronic computers that would come into use in the 20th century. The analytical engine had integrated memory, a form of information storage found in all computers, and it allowed for branching, the ability of a computer to execute a set of instructions that deviates from the default sequence order, as well as loops, which are sequences of instructions carried out repeatedly in succession.
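Leibniz's two digits, Boole's true-or-false tests, and the branching and loops Babbage designed into the analytical engine all survive, essentially unchanged, in modern programming languages. As a purely modern illustration (a short Python sketch, not anything these pioneers actually wrote), here is a routine that builds the binary form of a decimal number using a loop, a Boolean test, and a branch:

def to_binary(n):
    """Represent a non-negative decimal integer using only the digits 0 and 1."""
    if n == 0:                      # branching: deviate from the default sequence
        return "0"
    digits = []
    while n > 0:                    # loop: the same instructions, repeated in succession
        is_odd = (n % 2 == 1)       # Boolean logic: the test is either True or False
        digits.append("1" if is_odd else "0")
        n //= 2
    return "".join(reversed(digits))

print(to_binary(13))                # prints "1101", i.e. 8 + 4 + 0 + 1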
Despite his failure to produce a fully functional computing machine, Babbage remained steadfastly undeterred in pursuing his ideas. Between 1847 and 1849, he drew up designs for a new and improved second version of his difference engine. This one would calculate decimal numbers up to 30 digits long, perform calculations more quickly, and be simplified to require fewer parts. Still, the British government did not feel it was worth the investment. In the end, the most progress Babbage ever made on a prototype was completing one-seventh of his first design.

During this early era of computing, there were a few other notable achievements. The tide-predicting machine, invented by Scotch-Irish mathematician, physicist, and engineer Sir William Thomson in 1872, was considered the first modern analog computer. Four years later, his older brother, James Thomson, came up with a concept for a computer that solved mathematical problems known as differential equations. He called his device an "integrating machine," and in later years it would serve as the foundation for systems known as differential analyzers. In 1927, American scientist Vannevar Bush started development on the first machine to bear that name and published a description of his new invention in a scientific journal in 1931.

Dawn of Modern Computers

Up until the early 20th century, the evolution of computing was little more than scientists dabbling in the design of machines capable of efficiently performing various kinds of calculations for various purposes. It wasn't until 1936 that a unified theory of what constitutes a general-purpose computer and how it should function was finally put forth. That year, English mathematician Alan Turing published a paper titled "On Computable Numbers, with an Application to the Entscheidungsproblem," which outlined how a theoretical device called a "Turing machine" could be used to carry out any conceivable mathematical computation by executing instructions. In theory, the machine would have limitless memory, read data, write results, and store a program of instructions.

While Turing's computer was an abstract concept, it was a German engineer named Konrad Zuse who would go on to build the world's first programmable computer. His first attempt, the Z1, was a binary-driven calculator that read instructions from punched 35-millimeter film. The technology was unreliable, however, so he followed it up with the Z2, a similar device that used electromechanical relay circuits. While an improvement, it was in assembling his third model that everything came together for Zuse. Unveiled in 1941, the Z3 was faster, more reliable, and better able to perform complicated calculations. The biggest difference in this third incarnation was that the instructions were stored on an external tape, allowing it to function as a fully operational program-controlled system.

What's perhaps most remarkable is that Zuse did much of his work in isolation. He'd been unaware that the Z3 was Turing complete, or in other words, capable of solving any computable mathematical problem, at least in theory. Nor did he have any knowledge of similar projects underway around the same time in other parts of the world.
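The Turing machine that the Z3 unknowingly matched in power is simple enough to sketch in a few lines. The Python below is an assumed, heavily simplified illustration of the idea in Turing's paper, not his original notation: the "program" is a table of rules stating, for the current state and the symbol under the read/write head, which symbol to write, which way to move, and which state to enter next.

def run_turing_machine(program, tape, state="start", blank="_", max_steps=1000):
    """Run a transition table over an unbounded tape and return the final tape."""
    cells = dict(enumerate(tape))            # sparse tape: unwritten cells read as blank
    head = 0
    for _ in range(max_steps):
        symbol = cells.get(head, blank)
        if (state, symbol) not in program:   # no matching rule: the machine halts
            break
        state, write, move = program[(state, symbol)]
        cells[head] = write
        head += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells))

# An example program: sweep right, inverting every bit, and halt at the first blank.
invert_bits = {
    ("start", "0"): ("start", "1", "R"),
    ("start", "1"): ("start", "0", "R"),
}

print(run_turing_machine(invert_bits, "10110"))   # prints "01001"

Small as it is, a table of rules like this, given enough states and an unbounded tape, can carry out any computation a modern machine can, which is what "Turing complete" means.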
Among the most notable of those parallel projects was the IBM-funded Harvard Mark I, which debuted in 1944. Even more promising, though, was the development of electronic systems such as Great Britain's 1943 computing prototype Colossus and ENIAC, the first fully operational electronic general-purpose computer, which was put into service at the University of Pennsylvania in 1946.

Out of the ENIAC project came the next big leap in computing technology. John von Neumann, a Hungarian mathematician who'd consulted on the ENIAC project, would lay the groundwork for a stored-program computer. Up to this point, computers operated on fixed programs, and altering their function (for example, from performing calculations to word processing) required the time-consuming process of manually rewiring and restructuring them. (It took several days to reprogram ENIAC.) Turing had proposed that, ideally, having a program stored in memory would allow the computer to modify itself at a much faster pace. Von Neumann was intrigued by the concept and in 1945 drafted a report that laid out in detail a feasible architecture for stored-program computing.

His report was widely circulated among competing teams of researchers working on various computer designs. In 1948, a group in England introduced the Manchester Small-Scale Experimental Machine, the first computer to run a stored program based on the von Neumann architecture. Nicknamed "Baby," the Manchester machine was an experimental computer that served as the predecessor to the Manchester Mark I. The EDVAC, the computer design for which von Neumann's report was originally intended, wasn't completed until 1949.

Transitioning Toward Transistors

The first modern computers were nothing like the commercial products used by consumers today. They were elaborate, hulking contraptions that often took up the space of an entire room. They also consumed enormous amounts of energy and were notoriously buggy. And since these early computers ran on bulky vacuum tubes, scientists hoping to improve processing speeds would either have to find bigger rooms or come up with an alternative.

Fortunately, that much-needed breakthrough was already in the works. In 1947, a group of scientists at Bell Telephone Laboratories developed a new technology called the point-contact transistor. Like vacuum tubes, transistors amplify electrical current and can be used as switches. More importantly, they were much smaller (about the size of an aspirin capsule), more reliable, and used much less power overall. The co-inventors, John Bardeen, Walter Brattain, and William Shockley, would eventually be awarded the Nobel Prize in Physics in 1956.

While Bardeen and Brattain continued doing research work, Shockley moved to further develop and commercialize transistor technology. One of the first hires at his newly founded company was an electrical engineer named Robert Noyce, who eventually split off and formed his own firm, Fairchild Semiconductor, a division of Fairchild Camera and Instrument. At the time, Noyce was looking into ways to seamlessly combine the transistor and other components into one integrated circuit, eliminating the process in which they had to be pieced together by hand. Thinking along similar lines, Jack Kilby, an engineer at Texas Instruments, ended up filing a patent first. It was Noyce's design, however, that would be widely adopted.
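Transistors used as switches are what finally turned Boole's algebra into hardware: a few of them wired together make a logic gate, and gates compose into the arithmetic circuits that Kilby and Noyce began packing onto a single chip. The Python sketch below is only a software analogy (no particular historical circuit is being modeled), showing how one primitive gate can be composed into the others and then into a half adder, the simplest circuit that adds two binary digits:

def nand(a, b):                 # one primitive gate, buildable from a pair of switches
    return not (a and b)

def NOT(a):    return nand(a, a)
def AND(a, b): return nand(nand(a, b), nand(a, b))
def OR(a, b):  return nand(nand(a, a), nand(b, b))
def XOR(a, b): return OR(AND(a, NOT(b)), AND(NOT(a), b))

def half_adder(a, b):
    """Add two binary digits; return (sum bit, carry bit)."""
    return XOR(a, b), AND(a, b)

print(half_adder(True, True))   # prints (False, True): 1 + 1 = binary 10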
Where integrated circuits had the most significant impact was in paving the way for the new era of personal computing. Over time, they opened up the possibility of running processes powered by millions of circuits, all on a microchip the size of a postage stamp. In essence, it's what has enabled the ubiquitous handheld gadgets we use every day, devices that are, ironically, far more powerful than the earliest computers that took up entire rooms.