How is the speed of a computer measured?
Answers
Answer:
Clock speed is measured by how many ticks per second the clock makes. The clock speed of computers is usually measured in megahertz (MHz) or gigahertz (GHz). One megahertz equals one million ticks per second, and one gigahertz equals one billion ticks per second.
Answer:
The hertz (Hz) is defined as the number of cycles per second, making 1 megahertz (MHz) one million cycles per second, while 1 gigahertz (GHz) is 10^9 hertz, or one billion cycles per second. Vibrations and electromagnetic radiation are also measured in hertz, but in computing it is the clock speed of the central processing unit that is quoted in hertz.
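The unit conversions described in both answers can be sketched in a few lines of Python; the helper name `to_hz` and the example clock speed are illustrative, not from the original answers:

```python
# Minimal sketch of clock-speed unit conversion (hertz = cycles per second).
MHZ = 10**6  # 1 megahertz = one million cycles per second
GHZ = 10**9  # 1 gigahertz = one billion cycles per second

def to_hz(value: float, unit: str) -> float:
    """Convert a clock speed given in Hz, MHz, or GHz to raw hertz."""
    factors = {"Hz": 1, "MHz": MHZ, "GHz": GHZ}
    return value * factors[unit]

# A hypothetical 3.5 GHz CPU clock ticks 3.5 billion times per second:
print(to_hz(3.5, "GHz"))  # 3500000000.0
```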