

To review, Moore's law is the observation that, over the history of computing hardware, the number of transistors on an integrated circuit doubles approximately every two years. No one expects this trend to continue indefinitely. The universe is not infinitely divisible; matter is composed of particles with a certain minimum size, and transistors are made of atoms. Even if you could figure out how to make a transistor (or something that does what a transistor does) out of a single atom, which is very unlikely, that would seem to be an absolute limit. Atoms do have component parts (electrons, protons, and neutrons), but you cannot make a transistor out of a single subatomic particle, and even if you could, you would merely have reached a different limiting factor. So the compression of ever more transistors into a given amount of space has an inescapable limit.

Cooling is another constraint. The problem of cooling a computer becomes increasingly difficult as transistor density increases, because each transistor produces waste heat. Cram a sufficiently large number of transistors into a given space and, at some point, the chip would simply melt from its own waste heat. So that is also a limiting factor.
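The atomic-limit argument can be made concrete with rough arithmetic. The sketch below is purely illustrative: the starting feature size (50 nm) and the atomic diameter (0.2 nm, roughly a silicon atom) are assumed round numbers, not measured values. Note that if transistor *density* doubles every two years, the *linear* size of a transistor shrinks by a factor of sqrt(2) per doubling.

```python
# Illustrative only: assumed starting feature size and atom size,
# not actual process-node data.
FEATURE_NM = 50.0  # assumed transistor feature size in nanometers
ATOM_NM = 0.2      # rough diameter of a silicon atom in nanometers

years = 0
size = FEATURE_NM
while size > ATOM_NM:
    # Density doubling halves the area per transistor,
    # so linear size shrinks by sqrt(2) every two years.
    size /= 2 ** 0.5
    years += 2

print(f"Atomic scale reached after roughly {years} years")  # prints 32
```

Under these assumed numbers, sustained doubling hits the single-atom scale in a few decades, which is the point of the argument: the trend has a hard floor regardless of engineering ingenuity.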

This is not to say that computers will not continue to improve. There are doubtless many ways to improve computers, in both hardware and software. However, Moore's law cannot be taken literally: transistor density will not keep doubling every two years. There was a historical period when it did, but that period cannot last forever.
Ultimately, it is not possible to make a transistor smaller than one atom. Therefore, at some point Moore's law must fail, given the constraints of physical reality.


Davin Monahan

3y ago
