Computer Science, asked by AnirudhSaxena3918, 5 hours ago

Explain the concept of error in representing data in the computer.

Answers

Answered by imanya98

Error bars are graphical representations of the variability of data and are used on graphs to indicate the error or uncertainty in a reported measurement. They give a general idea of how precise a measurement is, or conversely, how far the true (error-free) value might be from the reported value.
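As an illustration (not part of the original answer), here is a minimal sketch, assuming matplotlib is available, of plotting measured values with error bars; the data values and uncertainties shown are made up for the example.

```python
# Minimal sketch: drawing error bars to show the uncertainty in each measurement.
import matplotlib.pyplot as plt

x = [1, 2, 3, 4]                      # measurement number (example data)
y = [2.1, 2.9, 3.8, 5.2]              # reported (measured) values (example data)
uncertainty = [0.2, 0.3, 0.25, 0.4]   # estimated error in each measurement (example data)

# yerr draws a vertical bar of the given half-length above and below each point,
# indicating how far the true value might lie from the reported one.
plt.errorbar(x, y, yerr=uncertainty, fmt='o', capsize=4)
plt.xlabel("Measurement number")
plt.ylabel("Measured value")
plt.title("Error bars showing measurement uncertainty")
plt.show()
```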
