Evolution Of Computer – Introduction
The word “computer” comes from “compute,” which primarily means to count or calculate. A computer is a general-purpose electronic device used to carry out arithmetic and logical operations automatically. The popular expansion “Commonly Operated Machine Particularly Used In Technical and Educational Research” is a backronym invented after the fact, not the actual origin of the word.
With the development of computer technology, its use keeps growing in research, technology, education, and even in our homes. But the sophisticated computers we use today are the product of a long evolution, which we shall go into in depth below.
Evolution Of Computer – Brief History
To keep track of how many livestock they owned, Stone Age people counted on their fingers, used pebbles, or scratched marks onto cave walls. They would place a fresh pebble whenever an animal was born and take one away when one died. In this way they managed to record their regular activities.
However, the procedure was incredibly laborious and slow, so new methods were devised to help with daily tasks.
- The ‘ABACUS,’ believed to have been in use in China by approximately 500 B.C., is perhaps the oldest known counting device. It comprises a wooden frame with rods on which counting beads are strung, the beads typically divided into two groups by a crossbar. Addition and subtraction are performed by moving the beads. The device is still used in China (as the suanpan) and Japan (as the soroban). A modified abacus designed by Tim Cranmer, known as the Cranmer abacus, is still frequently used by people who are blind.
- A Scottish mathematician named John Napier invented the first tool to aid multiplication and division. In 1617 he introduced ‘NAPIER’S BONES’: ivory rods carved with numbers derived from a table of logarithms. Because the logarithm of a product is the sum of the logarithms of its factors, the product of two numbers can be found by simply adding two table entries.
- Pascal’s adding machine, often known as the ‘PASCALINE,’ was created by the French mathematician and philosopher Blaise Pascal in 1642. It was a mechanical calculator built from gears and wheels that could add and subtract in base 10, and it was the first calculator to carry digits automatically.
- A later machine, the ‘LEIBNIZ STEP RECKONER,’ was created in 1671 by the German mathematician Gottfried Wilhelm von Leibniz, who studied the original notes and drawings left by Blaise Pascal. Along with performing large additions and subtractions, this machine was also capable of multiplication and division.
- Charles Babbage, a professor at Cambridge University, designed the ‘DIFFERENCE ENGINE’ in 1822 to compute mathematical and astronomical tables; a difference engine built on his principles was later acquired by the Dudley Observatory in Albany, New York.
Later, in 1834, he began designing the ‘ANALYTICAL ENGINE,’ which is regarded as the forerunner of modern computers. Babbage is referred to as ‘THE FATHER OF COMPUTERS’ because the fundamental ideas he established are still the foundation of modern digital computers.
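Napier's Bones, mentioned above, rest on a single identity: the logarithm of a product equals the sum of the logarithms, so a multiplication can be replaced by an addition plus table lookups. A minimal Python sketch of that principle (the function name is illustrative, not historical):

```python
import math

def multiply_with_logs(a: float, b: float) -> float:
    """Multiply two positive numbers by adding their base-10 logarithms.

    This mirrors the idea behind Napier's logarithm tables:
    log10(a * b) = log10(a) + log10(b), so the product is
    recovered by raising 10 to the summed logarithm.
    """
    log_sum = math.log10(a) + math.log10(b)
    return 10 ** log_sum

# 23 x 45 via logarithm addition (rounded to absorb float error):
print(round(multiply_with_logs(23, 45)))  # 1035
```

In practice a 17th-century calculator looked the logarithms up in a printed table rather than computing them, but the arithmetic shortcut is the same.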
Thus, over the course of human history, computers began to play a bigger and bigger role in our daily lives.
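The Pascaline's gear-and-wheel addition described earlier amounts to base-10 ripple carry: turning a digit wheel past 9 resets it and advances the next wheel by one. A small Python sketch of that behaviour (the function name and list representation are illustrative assumptions, not part of the historical design):

```python
def pascaline_add(digits, amount):
    """Add `amount` to a base-10 register, least-significant digit first.

    Each list entry models one digit wheel; a carry ripples to the
    next wheel just as the Pascaline's gears did mechanically.
    A carry past the last wheel is simply lost (overflow).
    """
    digits = list(digits)
    carry = amount
    i = 0
    while carry and i < len(digits):
        total = digits[i] + carry
        digits[i] = total % 10   # what this wheel now shows
        carry = total // 10      # what ripples to the next wheel
        i += 1
    return digits

# 199 + 5 = 204, stored least-significant digit first:
print(pascaline_add([9, 9, 1], 5))  # [4, 0, 2]
```

Note how adding 5 to the ones wheel forces two carries in a row, which is exactly the cascading motion Pascal's mechanism automated.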
The Historical Evolution Of Computer Generation:
We’ll talk about computer generations now. Although today’s computer generations are defined by a combination of hardware and software technologies, the term “generation” formerly referred to hardware technology alone. On this basis we can categorize the development of computers into five generations.
1. First Generation (About 1942-1955) – The Age Of VACUUM TUBE
Vacuum tubes were the key component of the first generation of computers. The vacuum tube, invented in 1906 by the American physicist Lee De Forest, replaced electromechanical relays in computers between 1942 and 1955. These computers were big, consumed large amounts of electricity, and produced a lot of heat.
1. J.P. Eckert and J.W. Mauchly created the ENIAC (Electronic Numerical Integrator and Computer) for the Ballistic Research Laboratory of the U.S. War Department. Completed in 1946, it was the first general-purpose electronic computer.
2. Eckert and Mauchly then led a group of engineers in building the UNIVAC (UNIVersal Automatic Computer) over a period of three years. Dedicated on June 14, 1951, and delivered to the U.S. Census Bureau, it was the first general-purpose electronic digital computer intended for business use.
2. Second Generation (About 1956- 1964) – The Age Of TRANSISTOR
Second-generation computers are considered to have started with the introduction of semiconductor components. Transistors, used extensively in this generation, were cheaper, less energy-intensive, physically smaller, more dependable, and faster than the vacuum-tube-based machines of the previous generation.
At Bell Laboratories in Murray Hill, New Jersey, William Shockley, Walter Houser Brattain, and John Bardeen developed the transistor in 1947. Programming languages like COBOL and FORTRAN were introduced around this time.
Example: – The most widely used computers during this period were the IBM 1401, Honeywell 400, IBM 1620, and CDC 1604.
3. Third Generation (About 1965-1971) – The Age Of INTEGRATED CIRCUIT (IC)
With the adoption of transistors the size of computing devices dropped, but as computers became more powerful, ever more transistors were required, so the third generation of computers employed Integrated Circuits (ICs) instead. A single IC contains numerous transistors, resistors, and capacitors along with their associated circuitry, so the machines of this generation became more affordable, and their processing speeds were measured in nanoseconds.
The integrated circuit was invented in 1958 by Jack Kilby at Texas Instruments; Robert Noyce at Fairchild Semiconductor independently developed a practical monolithic IC shortly afterwards.
4. Fourth Generation (About 1971-1985) – The Age Of MICROPROCESSOR
The term “very large scale integration” (VLSI) describes fourth-generation computers and refers to the number of active transistors or components that can fit on a single integrated circuit (IC) chip. Packing thousands (and eventually millions) of transistors onto a single IC produced the microprocessor, which is regarded as the defining component of the fourth generation. Because of their accessibility, affordability, and advantages over analogue devices for daily tasks, computers then began to be used personally.
The first microprocessor was created in 1971 by a group of logic architects and silicon engineers: the Italian-American physicist and engineer Federico Faggin, the American engineers Marcian Hoff and Stanley Mazor, and the Japanese engineer Masatoshi Shima. They worked on this project for the Japanese calculator company Busicom.
Intel’s 4-bit 4004 chip is commonly considered the first microprocessor in history.
Example: – PCs of this era used the Intel 8085, 80286, 386, and other processors. Intel was the first company to create a microprocessor, and its Pentium processors went on to power many personal computers.
Currently, the two most well-known processor manufacturers are Intel and AMD.
5. Fifth Generation (About 1985-Present) – The Age Of ARTIFICIAL INTELLIGENCE
The major difference between the fourth- and fifth-generation technologies is that the VLSI (Very Large Scale Integration) technology of the fourth generation evolved into ULSI (Ultra Large Scale Integration) technology, enabling the production of microprocessor chips with millions of electronic components.
The fifth generation of computers arrived with a new technology known as ARTIFICIAL INTELLIGENCE (AI), the idea being that this generation of computers should be able to learn from their mistakes, take commands in spoken language, and emulate human reasoning.
John McCarthy first used the phrase “artificial intelligence” in 1956, when he hosted the first scholarly conference on the topic. In 2006, five years before his passing, he was also dubbed “The Father of Artificial Intelligence.”
Herbert Simon and Allen Newell created the first artificial intelligence program, Logic Theorist, in December 1955.
Examples include desktops, laptops, notebooks, ultrabooks, and so on.
Thus, after many years of development, we now have the modern computers that are so crucial to our daily lives, and we should see even more cutting-edge technology in the future.
Hopefully the discussion above has given you some new and interesting information about the evolution of computers and its generations. Thank you.