Computer Science, asked by am5692871, 1 month ago

Is the speed of a computer measured in bytes?

Answers

Answered by kashvichaurasia819

Answer:

No, computer speed is not measured in bytes; bytes measure the amount of data or storage. The clock speed of a computer is usually measured in megahertz (MHz) or gigahertz (GHz). One megahertz equals one million ticks per second, and one gigahertz equals one billion ticks per second. You can use clock speed as a rough measure of how fast a computer is.
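As a minimal sketch of that arithmetic in Python (the 3.2 GHz figure is just a hypothetical example value, not taken from the answer above):

# Convert a clock speed in GHz to ticks (cycles) per second.
def ghz_to_ticks_per_second(ghz: float) -> float:
    """One gigahertz = one billion clock ticks per second."""
    return ghz * 1_000_000_000

if __name__ == "__main__":
    clock_ghz = 3.2  # hypothetical CPU clock speed, for illustration only
    print(f"{clock_ghz} GHz = {ghz_to_ticks_per_second(clock_ghz):,.0f} ticks per second")
    # If the optional third-party psutil package is installed, the current
    # clock speed (reported in MHz) can be read with:
    # import psutil
    # print(psutil.cpu_freq().current, "MHz")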

Answered by mishtipandey029

Answer:

No, speed is not measured in bytes. Bytes measure the amount of data or storage (for example, file size or memory capacity), not how fast a computer works.
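A small illustration of that point (the example string is arbitrary): bytes count how much data there is, not how fast anything runs.

# Bytes measure data size, not speed.
text = "computer"
data = text.encode("utf-8")  # encode the string as a bytes object
print(len(data), "bytes")    # prints: 8 bytes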
