

Question: History of Computers?
Best Answer - Chosen by Asker:
It is difficult to identify any one device as the earliest computer, partly because the term "computer" has been subject to varying interpretations over time. Originally, the term "computer" referred to a person who performed numerical calculations (a human computer), often with the aid of a mechanical calculating device.
The history of the modern computer begins with two separate technologies: that of automated calculation and that of programmability.
Examples of early mechanical calculating devices included the abacus, the slide rule and arguably the astrolabe and the Antikythera mechanism (which dates from about 150-100 BC). The end of the Middle Ages saw a re-invigoration of European mathematics and engineering, and Wilhelm Schickard's 1623 device was the first of a number of mechanical calculators constructed by European engineers. However, none of those devices fit the modern definition of a computer because they could not be programmed.
Hero of Alexandria (c. 10-70 AD) built a mechanical theater which performed a play lasting 10 minutes and was operated by a complex system of ropes and drums that might be considered a means of deciding which parts of the mechanism performed which actions, and when.[3] This is the essence of programmability. In 1801, Joseph Marie Jacquard made an improvement to the textile loom that used a series of punched paper cards as a template to allow his loom to weave intricate patterns automatically. The resulting Jacquard loom was an important step in the development of computers because the use of punched cards to define woven patterns can be viewed as an early, albeit limited, form of programmability.
It was the fusion of automatic calculation with programmability that produced the first recognizable computers. In 1837, Charles Babbage was the first to conceptualize and design a fully programmable mechanical computer, which he called "The Analytical Engine".[4] Due to limited finances, and an inability to resist tinkering with the design, Babbage never actually built his Analytical Engine.
Large-scale automated data processing of punched cards was performed for the U.S. Census in 1890 by tabulating machines designed by Herman Hollerith; his Tabulating Machine Company later merged into the Computing-Tabulating-Recording Company, which became IBM. By the end of the 19th century a number of technologies that would later prove useful in the realization of practical computers had begun to appear: the punched card, Boolean algebra, the vacuum tube (thermionic valve) and the teleprinter.
During the first half of the 20th century, many scientific computing needs were met by increasingly sophisticated analog computers, which used a direct mechanical or electrical model of the problem as a basis for computation. However, these were not programmable and generally lacked the versatility and accuracy of modern digital computers.

I'm going to take a risk and go entirely by memory on this. :-)

Depending on how you define "computer", they could conceivably be said to have begun thousands of years ago. Gears & cogs & logic chains & such are all that are needed for a technical definition. I believe the first water clock would have qualified.

But if you're talking exclusively about electronic computers, then they began in the mid-1940s with the invention of a behemoth of a machine called ENIAC. It used vacuum tubes and miles of wiring, sucked up exorbitant amounts of electricity, and generated tons of heat. It filled an entire large room. To make a computer of that sort with the same capabilities as today's average desktop unit, you'd probably need a room about the size of half the moon.

Vacuum tubes were done away with when transistors were invented, and that is when the binary revolution began. Transistors, capacitors, resistors and the like were eventually shrunk down and combined onto microchips. We still use microchips in pretty much everything today, but only because quantum computing hasn't been perfected yet. Once quantum computing enters the picture in its full glory, today's computers may well become obsolete.