Dated: Jan. 13, 2010
The computer as we see it today is the result of decades of extensive research and development. The origins of the computer and a brief history of its evolution are outlined below.
The word 'computer' comes from the word 'compute', which means 'to calculate'. Computers were developed from calculators as the need arose for more complex and scientific calculations. Charles Babbage is known as the father of computers because of his immense contribution to the idea of programmable machines. He designed the Difference Engine to calculate and print mathematical tables with great precision, and later conceived the Analytical Engine, a design for a general-purpose programmable computer. But there were many practical problems, and progress was slow.
During the Second World War, the U.S. military needed fast computers that could perform extremely complex calculations, such as weather predictions, in minutes. This was when the ENIAC was built, through a partnership between the University of Pennsylvania and the U.S. government. The landmark 'von Neumann architecture', introduced shortly after, considerably increased the speed of computers by storing both the program and the data in a single memory. The EDVAC and UNIVAC were built on this architecture using vacuum tubes. All the computers built from 1941 to 1956 are called 'first generation' computers.
The transistor was invented in 1947 by John Bardeen, Walter Brattain, and William Shockley at Bell Laboratories in the U.S.A. The 'second generation' of computers used transistors, the stored-program concept, and magnetic-core memory, making them smaller, faster, more reliable, and more energy efficient than their predecessors. Throughout the 1960s, computers from manufacturers such as Honeywell and IBM were used commercially by large businesses and in universities.
Transistors generated a great deal of heat, which was found to damage the more sensitive parts of a computer. In 1958, the integrated circuit (IC) was invented by Jack Kilby. This revolutionized computing, since all the electronic components could be placed on a single semiconductor chip made of silicon, drastically reducing the size of the computer. New operating systems were also developed that allowed many applications to run simultaneously. The computers developed from 1964 to 1971 were the 'third generation' machines.
The microprocessor ushered in the 'fourth generation' of computers. Thousands of integrated circuits were fitted onto a single minuscule chip using VLSI and ULSI technology. This made computers smaller, more portable, and much faster than before. In 1981, IBM introduced a computer designed especially for use at home. Apple followed with its 'Macintosh'. These small but powerful computers could be linked to one another, which eventually led to networking and the Internet.
The 'fifth generation' of computers includes present-day computers and those still being developed. These devices are based on the concept of artificial intelligence and draw on new technologies such as quantum computing and superconductors. One application is voice recognition, software that recognizes the user's voice and responds to it. Parallel processing, in which many calculations are carried out simultaneously, is still in a nascent stage but has immense potential.