Computer Science, asked by gyal6, 6 months ago

Computers have continued to decrease in size but the processing power has increased. Explain.

Answers

Answered by PalakKumari1602

Answer:

All current computer device technologies are indeed limited by the speed of electron motion. This limitation is rather fundamental, because the fastest possible speed for information transmission is of course the speed of light, and the speed of an electron is already a substantial fraction of this. Where we hope for future improvements is not so much in the speed of computer devices as in the speed of computation. At first, these may sound like the same thing, until you realize that the number of computer device operations needed to perform a computation is determined by something else: the algorithm.

A very efficient algorithm can perform a computation much more quickly than can an inefficient algorithm, even if there is no change in the computer hardware. So further improvement in algorithms offers a possible route to continuing to make computers faster; better exploitation of parallel operations, pre-computation of parts of a problem, and other similar tricks are all possible ways of increasing computing efficiency.
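The gap between an efficient and an inefficient algorithm can be made concrete with a step count. As a minimal sketch (the list size and the search problem are illustrative assumptions, not from the passage), compare searching a sorted list item by item against halving the search range each step:

```python
import math

# Step counts for two ways of finding an item in a sorted list of n entries.
# The hardware is the same; only the algorithm differs.
n = 1_000_000

linear_steps = n                        # scan every item: O(n)
binary_steps = math.ceil(math.log2(n))  # halve the range each step: O(log n)

print(linear_steps)  # 1000000
print(binary_steps)  # 20
```

On the same machine, the second algorithm needs roughly 50,000 times fewer basic operations, which is exactly the kind of speedup no amount of faster hardware alone provides.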

These ideas may sound like they have nothing to do with 'physical restrictions,' but in fact we have found that by taking into account some of the quantum-mechanical properties of future computer devices, we can devise new kinds of algorithms that are much, much more efficient for certain computations. We still know very little about the ultimate limitations of these 'quantum algorithms.'

The speed of computers is limited by how fast they can move information from where it is now to where it has to go next, and by how fast that information can be processed once it gets there. An electronic computer computes by moving electrons around, so the physical restrictions of an electron moving through matter determine how fast such computers can run. It is important to realize, however, that information can move about a computer much faster than the electrons themselves. Consider a garden hose: when you turn on the faucet, how long does it take for water to come out the other end? If the hose is empty, then the amount of time is equal to the length of the hose divided by the velocity at which water flows down the hose. If the hose is full, then the amount of time it takes for water to emerge is the length of the hose divided by the velocity at which an impulse propagates down the hose, a velocity approximately equal to the speed of sound in water.
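The hose analogy can be put into numbers. The figures below are rough assumptions chosen for illustration (a 10 m hose, water flowing at 1 m/s, sound in water at about 1480 m/s); the point is the ratio, not the exact values:

```python
# Toy model of the garden-hose analogy: an impulse through a full hose
# arrives far sooner than the water itself through an empty one.
length = 10.0            # hose length, metres (assumed)
water_speed = 1.0        # flow speed of water, m/s (assumed)
sound_in_water = 1480.0  # speed of a pressure impulse in water, m/s

empty_hose_delay = length / water_speed    # the water must travel the length
full_hose_delay = length / sound_in_water  # only the impulse must travel it

print(empty_hose_delay)  # 10.0 seconds
print(full_hose_delay)   # about 0.007 seconds
```

The full hose "transmits" a signal over a thousand times faster, just as wires packed with electrons carry signals far faster than any individual electron moves.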

The wires in an electronic computer are like full hoses: they are already packed with electrons. Signals pass down the wires at the speed of light in metal, approximately half the speed of light in vacuum. The transistorized switches that perform the information processing in a conventional computer are like empty hoses: when they switch, electrons have to move from one side of the transistor to the other. The 'clock rate' of a computer is then limited by the maximum length that signals have to travel divided by the speed of light in the wires, and by the size of transistors divided by the speed of electrons in silicon. In current computers, these numbers are on the order of trillionths of a second, considerably shorter than the actual clock times of billionths of a second. The computer can be made faster by the simple expedient of decreasing its size. Better miniaturization techniques have been, and remain, the most important approach to speeding up computers.
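A rough estimate shows why these delays land in the "trillionths of a second" range the passage mentions. The wire length, transistor size, and electron speed below are illustrative assumptions, not measured device parameters:

```python
# Back-of-envelope signal delays in a conventional computer.
c_vacuum = 3.0e8           # speed of light in vacuum, m/s
c_wire = 0.5 * c_vacuum    # signals in wires: roughly half light speed
wire_length = 0.01         # 1 cm of on-chip wiring, metres (assumed)
electron_speed = 1.0e5     # electron speed in silicon, m/s (assumed)
transistor_size = 1.0e-7   # 100 nm transistor, metres (assumed)

wire_delay = wire_length / c_wire                    # ~7e-11 s
transistor_delay = transistor_size / electron_speed  # ~1e-12 s

print(wire_delay, transistor_delay)
```

Both delays come out at picoseconds or tens of picoseconds, comfortably below the nanosecond-scale clock periods of the computers described, and both shrink as the machine is miniaturized.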

In practice, electronic effects other than the speed of light and the speed of electrons are at least as important in limiting the speed of conventional computers. Wires and transistors both possess capacitance, C, which measures their capacity to store electrons, and resistance, R, which measures the extent to which they resist the flow of current. The product of resistance and capacitance, RC, gives the characteristic time scale over which charge flows on and off a device. When the components of a computer get smaller, R goes up and C goes down, so making sure that every piece of a computer has the time to do what it needs to do is a tricky balancing act. Technologies for performing this balancing act without crashing are the focus of much present research.
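The RC time scale is a simple product, which makes the balancing act easy to sketch. The resistance and capacitance values here are illustrative assumptions for a small on-chip component:

```python
# Characteristic charging time of a device is tau = R * C.
R = 1.0e3    # resistance, ohms (assumed)
C = 1.0e-15  # capacitance, farads: one femtofarad (assumed)

tau = R * C  # seconds over which charge flows on and off the device
print(tau)   # 1e-12 s, i.e. one picosecond

# Shrinking the component raises R and lowers C, so whether tau improves
# depends on which effect wins: this is the "balancing act" in the text.
```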

As noted above, one of the limits on how fast computers can function is given by Einstein's principle that signals cannot propagate faster than the speed of light. So to make computers faster, their components must become smaller. At current rates of miniaturization, the behavior of computer components will hit the atomic scale in a few decades. At the atomic scale, the speed at which information can be processed is limited by Heisenberg's uncertainty principle. Recently, researchers working on 'quantum computers' have constructed simple logical devices that store and process information on individual photons and atoms.
