Friday, July 18, 2008

History of Computers

The history of computing hardware covers the history of computer hardware, its architecture, and its impact on software. Originally, calculations were performed by humans, whose job title was "computer".[1] See the history of computing article for methods intended for pen and paper, with or without the aid of tables; for a detailed timeline of events, see the computing timeline article.

The von Neumann architecture unifies current computing hardware implementations.[2] The major elements of computing hardware are input,[3] output,[4] control[5] and datapath (which together make a processor),[6] and memory.[7] These elements have undergone successive refinement over the history of computing hardware. Beginning with mechanical mechanisms, hardware then used analogs for computation, including water and even air as the analog quantities: analog computers have used lengths, pressures, voltages, and currents to represent the results of calculations.[8] Eventually the voltages or currents were standardized, and digital computers were developed over a period of evolution dating back centuries. Digital computing elements have ranged from mechanical gears, to electromechanical relays, to vacuum tubes, to transistors, and to integrated circuits, all of which currently implement the von Neumann architecture.[9]

Since digital computers rely on digital storage, and tend to be limited by the size and speed of memory, the history of computer data storage is tied to the development of computers. The improvement in computing hardware has triggered worldwide use of the technology. Even as performance has improved, prices have declined,[10] until computers have become commodities, accessible to ever-increasing sectors[11] of the world's population. Computing hardware thus became a platform for uses other than computation, such as automation, communication, control, entertainment, and education.
Each field in turn has imposed its own requirements on the hardware, which has evolved in response to those requirements.[12]

Earliest calculators

Devices have been used to aid computation for thousands of years, beginning with one-to-one correspondence with our fingers.[13] The earliest counting device was probably a form of tally stick. Later record-keeping aids include Phoenician clay shapes, which represented counts of items, probably livestock or grains, in containers.[14] The abacus was used for arithmetic tasks; what is now called the Roman abacus was used in Babylonia as early as 2400 BC. Since then, many other forms of reckoning boards or tables have been invented. In a medieval counting house, a checkered cloth would be placed on a table, and markers moved around on it according to certain rules, as an aid to calculating sums of money.[15] A number of analog computers were constructed in ancient and medieval times.
