Computers, by our above definition, have been around for thousands of years. One of the earliest computers was the abacus, a series of beads arranged on metal rods. The beads could be slid back and forth to operate on numbers. It was a very rudimentary device and is not commonly thought of as a computer today; our modern idea of a computer involves electricity and electronics.
Electricity makes computers much more efficient. The first electronic computers used an incredible amount of electricity, switching voltages in vacuum tubes to carry out their operations. These computers were given instructions using punch cards, and they were behemoths, taking up entire floors of buildings. Only the most privileged universities and government facilities had access to them.
In the 1960s, the vacuum tube was replaced by the transistor and the integrated circuit, which greatly reduced the size and power consumption of computers. They were still very large by today's standards, but more institutions had access to computing power than ever before. At the end of the decade, the microchip was introduced, shrinking computers even further.
By the end of the 1970s, computers were widespread in businesses. Using a computer meant typing at a terminal (a keyboard and monitor connected to a large central computer). Soon, components became small enough for many users to have a computer in their own homes. Thus the Personal Computer, or PC, was born.
Since then, PCs have become tremendously more efficient. They are much smaller, yet deliver dramatically better performance. In addition to these improvements, computers have become affordable enough for many families worldwide.