Physics, asked by Anonymous, 6 months ago

What do you mean by zero error of a screw gauge? How is it accounted for?

Answers

Answered by llɱissMaɠiciaŋll

Explanation:

Zero error is caused by wear and tear of the screw, nut, spindle and/or the anvil.

When the spindle is closed so that its face touches the anvil, the scales should read exactly 0.00 mm.

(To eliminate the human error of different people applying different pressures to the spindle, a ratchet mechanism is incorporated in the instrument, so that a constant force acts on the spindle no matter who handles the gauge.)

After prolonged usage, when the spindle is closed so that its face touches the anvil, the scales may not read 0.00 mm. The deviation may be in the positive direction (e.g., +0.01 mm) or in the negative direction (e.g., -0.01 mm).

This deviation, technically known as Zero Error, has to be ascertained before using the instrument.

Then, after taking a measurement, this deviation has to be subtracted from the observed reading, taking its sign into account (so a negative zero error is effectively added), to get the correct measurement.
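For example (the numbers here are purely illustrative, not from the question): if the observed reading is 5.67 mm and the zero error is +0.01 mm, the correct measurement is 5.67 - 0.01 = 5.66 mm; if the zero error is -0.01 mm, the correct measurement is 5.67 - (-0.01) = 5.68 mm.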

Answered by rawalkinjal33

Answer:

Due to mechanical errors, sometimes when the anvil and the spindle end are brought into contact, the zero mark of the circular scale does not coincide with the base line of the main scale; it lies either above or below the base line. In this case the screw gauge is said to have a zero error, which can be either positive or negative.

It is accounted for by subtracting the zero error (with its sign) from the observed reading in order to get the correct reading.

Correct reading = Observed reading - Zero error (with sign)
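As a rough sketch of this rule in code (the function name and the readings below are made up only for illustration, not taken from the question), the sign convention works like this:

```python
def corrected_reading(observed_mm, zero_error_mm):
    # Correct reading = observed reading - zero error (with sign).
    return round(observed_mm - zero_error_mm, 2)

# A positive zero error (+0.01 mm) is subtracted from the observed reading.
print(corrected_reading(5.67, +0.01))  # 5.66
# A negative zero error (-0.01 mm) effectively adds its magnitude back.
print(corrected_reading(5.67, -0.01))  # 5.68
```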

Explanation:

Hope it helps!
