If you look at the recent history of computers and
microchips, the technology has gotten much smaller, faster, and less expensive. Moore's
Law states that the number of transistors that can be manufactured and
arranged on a microchip in a cost-effective manner will double roughly once every two
years.
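Stated as arithmetic, the law is just exponential growth with a two-year doubling period. Here is a minimal sketch in Python; the 1971 Intel 4004 baseline of roughly 2,300 transistors is used only as an illustrative starting point, and the projections are idealized rather than actual historical counts:

```python
def projected_transistors(year, base_year=1971, base_count=2_300, doubling_period=2):
    """Project a transistor count assuming a doubling every `doubling_period` years."""
    doublings = (year - base_year) / doubling_period
    return base_count * 2 ** doublings

# Idealized projections from the illustrative 4004 baseline
for year in (1971, 1981, 1991, 2001):
    print(f"{year}: ~{projected_transistors(year):,.0f} transistors")
```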
The rule has held up quite well over time, although
some say the technology has to plateau sooner or later. The law has also gone far
beyond what Gordon Moore himself could have imagined when he first described the trend
in 1965:
"The complexity for minimum component costs has increased at a rate of roughly a factor of two per year... Certainly over the short term this rate can be expected to continue, if not to increase. Over the longer term, the rate of increase is a bit more uncertain, although there is no reason to believe it will not remain nearly constant for at least 10 years. That means by 1975, the number of components per integrated circuit for minimum cost will be 65,000. I believe that such a large circuit can be built on a single wafer."
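The 65,000 figure is just the doubling arithmetic carried forward. Assuming the roughly 64 components per circuit that Moore's 1965 data implied, ten annual doublings land almost exactly on his prediction:

```python
components_1965 = 64          # approximate figure implied by Moore's 1965 data
doublings = 1975 - 1965       # one doubling per year over ten years
components_1975 = components_1965 * 2 ** doublings
print(components_1975)        # 65536 -- the "65,000" Moore predicted
```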
You can see
the same pattern outside of general-purpose computing as well. DVD players and iPods,
for example, became faster and gained more capacity and capability, and yet they also
became less expensive.