Which least number should be subtracted from 1000 so that 35 divides the difference completely? Please explain step by step.
Answers
What is the least number that should be subtracted from 1000 so that 35 divides the difference exactly?
According to Euclid's division algorithm,
a = bq + r
where
a = dividend,
b = divisor,
q = quotient,
and r = remainder.
So let a = 1000 and b = 35.
Then we get
1000 = 35 × 28 + 20
Subtracting 20 from both sides,
1000 − 20 = 35 × 28 + 20 − 20
Thus, 980 = 35 × 28
Therefore, as seen above, 980 is exactly divisible by 35.
And so, 20 is the smallest number to be subtracted from 1000 so that the difference is exactly divisible by 35.
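A quick way to reproduce this division step is the sketch below (Python is assumed here; the variable names a, b, q, r simply mirror Euclid's division algorithm and are not part of the original answer):

```python
# Sketch only: check 1000 = 35*q + r and that subtracting r makes it divisible.
a, b = 1000, 35          # dividend and divisor
q, r = divmod(a, b)      # quotient q = 28, remainder r = 20
print(q, r)              # -> 28 20
print((a - r) % b == 0)  # -> True: 1000 - 20 = 980 is exactly divisible by 35
```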
20 is the least number that should be subtracted from 1000 so that 35 divides the difference exactly.
Step-by-step explanation:
We have to find the least number that should be subtracted from 1000 so that 35 divides the difference exactly.
For this, we shall first divide 1000 by 35.
On dividing 1000 by 35, we get
Quotient = 28 and Remainder = 20
So, 20 is the least number that should be subtracted from 1000 to make it exactly divisible by 35.
Verification:
We can verify this by subtracting 20 from 1000, which gives 980.
On dividing 980 by 35, we get remainder 0, which means 980 is exactly divisible by 35.
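For completeness, the same rule (the least number to subtract is the remainder of the division) can be wrapped in a small helper; this is only an illustrative sketch, and the name least_to_subtract is a hypothetical choice:

```python
def least_to_subtract(n: int, d: int) -> int:
    """Least number to subtract from n so that d divides the difference."""
    return n % d  # the remainder when n is divided by d

r = least_to_subtract(1000, 35)
print(r)                     # -> 20
print((1000 - r) % 35 == 0)  # -> True: 980 is divisible by 35
```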