The History Of The Computer
Every great song is a continuation of a previous great song, and so it is with the computer. Primitive forms of the modern-day computer originated some 5,000 years ago. It all started with the development of the abacus, which eventually gave way to pencil and paper. This rudimentary computing style persisted for thousands of years. The post-World War II era saw the first generation of what is now known as the modern computer, and its evolution has continued at an accelerating pace, restricted only by technological wherewithal and, possibly, human imagination.
Pencil and paper, the computing method of choice for thousands of years after the introduction of the abacus, remained dominant until the mid-1600s, when the Pascaline was developed. This basic computer relied on movable dials for its calculations; however, its abilities were essentially limited to addition.
It was not until the mid-1800s that Thomas de Colmar developed a machine capable of all four basic arithmetic functions: it could add, subtract, multiply, and divide. This computer, called the arithmometer, was widely used until the start of the First World War. Colmar, along with the earlier inventors Pascal and Leibniz, is credited with defining mechanical computation as we know it today.
England's own Charles Babbage, a mathematics professor, designed the first multi-functional computer in the 1830s, which he named the Analytical Engine. Although the device was never constructed, the concept was considered a breakthrough.
To deal with the expanding United States population during the late 1800s, census takers determined that they needed a faster, more comprehensive computing system. The previous census had taken seven years to complete, and there were fears that the latest one would take at least ten. In 1890, Herman Hollerith, an American inventor, improved and further developed the punch-card reader as a method for storing data and reducing computational errors. Using Hollerith's invention, census takers compiled the results in less than two months! This new computational and storage system was soon introduced into the business world, and the company Hollerith founded eventually became known as International Business Machines (IBM) in 1924. Punch-card readers remained a standard method of data processing until about 1965.
The Second World War spurred further development of the computer; its potential strategic importance hastened the pace of technological progress. This new urgency helped drive the invention of the transistor in 1947, which transformed computers so drastically that the most powerful new machines came to be called supercomputers. Computers could now store programs and be programmed in high-level languages, which finally made them cost-effective and useful to the business world. The software industry was born.
Transistors were clearly a crucial breakthrough, but there was a serious problem: they generated a great deal of heat. As a result, the integrated circuit was developed, combining electronic components on a small silicon disk made from quartz. This small silicon disk became known as the semiconductor. Taking it one step further, by the 1980s thousands of components were being loaded onto a single chip, which came to be called a microprocessor. The microprocessor could be programmed to meet the needs of a wide range of products, such as television sets, automobiles, and recorders.
IBM introduced its first personal computer (PC) in 1981; ten years later, 65 million PCs were in use. As computer usage became more widespread, linking computers together, or networking, created new ways to harness their potential, enabling them to share memory space, software, and information, and to communicate with one another. Using cables or telephone lines, local area networks (LANs) created networks of monumental proportions. Global computer circuitry, the Internet, now linked computers around the world into a single information network.
Artificial intelligence, spoken-word instructions, and human-like reasoning are just a few of the modern engineering advances in computer technology. Designing computers that are easy to understand and use is technologically complicated. To look into the crystal ball of the future, one has to understand and appreciate how far we have come since the invention of the integrated circuit; our imagination is our only limitation.
Written by: Raymond G. James
Raymond G. James is a writer and a website developer living in Minnesota. For more news and information, please visit: http://www.internetobservations.net or http://www.cookingandmore.net
Category: Internet
Keywords: abacus, computer, Pascaline, Leibniz, transistor, IBM, LAN, internet, PCs, microprocessor, integrated circuits