If the resistance across a 12V source is increased by 4 ohms, the current drops by 0.5A. What was the original resistance?
Answers
Answer:
Say I1 is 1A. 12V/1A = 12 ohms. To cut the current by 0.5A would mean 12V/(1A - 0.5A) = 24 ohms, so the resistance would have to increase by 12 ohms to drop the current by 0.5A.
Say I1 = 2A. 12V/2A = 6 ohms. 12V/(2A - 0.5A) = 8 ohms. So that is an increase of only 2 ohms to drop the current by 0.5A. Since the increase we need, 4 ohms, falls between 2 and 12, a solution is possible somewhere between 6 and 12 ohms for the original resistance.
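For what it's worth, here are the same two trials as a quick Python sketch (nothing but Ohm's law; the variable names are my own):

V = 12.0      # source voltage, volts
DROP = 0.5    # required current drop, amperes
for i1 in (1.0, 2.0):            # the two trial currents above
    r1 = V / i1                  # original resistance for this trial current
    r2 = V / (i1 - DROP)         # resistance needed after the 0.5A drop
    print(i1, r1, r2, r2 - r1)   # prints the 12-ohm and 2-ohm increases found above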
R1 = unknown original resistance
R2 = 4 ohms
I1 = original current
I1 = 12V/R1
I1 - 0.5A = 12V/(R1+R2)
I1 = 12V/(R1+R2) + 0.5A
12V/R1 = 12V/(R1+R2) + 0.5A
Hm. Kinda stuck here. I don’t find myself doing algebra every day, and I am a bit tired.
Let’s try something.
We now know the original resistance is between 6 and 12 ohms, and the original current between 1A and 2A. Let's split the difference on the current and see what happens.
12V/1.5A = 8 ohms
12V/(1.5A - 0.5A) = 12 ohms
Tada! That is an increase of exactly 4 ohms, so the original resistance is 8 ohms.
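Splitting the difference like this is really one step of a bisection search on the current. A minimal Python sketch of the full search (the 1A-to-2A bracket comes from the trials above; the iteration count is an arbitrary choice of mine):

V = 12.0       # source voltage, volts
DROP = 0.5     # required current drop, amperes
TARGET = 4.0   # required resistance increase, ohms

def increase(i1):
    # resistance increase needed for the current to fall from i1 to i1 - DROP
    return V / (i1 - DROP) - V / i1

lo, hi = 1.0, 2.0                 # bracket on the original current, from the trials
for _ in range(50):
    mid = (lo + hi) / 2.0
    if increase(mid) > TARGET:    # needed increase too large: true current is higher
        lo = mid
    else:
        hi = mid
print(V / ((lo + hi) / 2.0))      # original resistance; converges to 8.0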
Explanation:
Initially, let I = 12/R amperes, where R is the original resistance in ohms.
When R is increased by 4 ohms:
12/(R + 4) = (I - 0.5) = (12/R) - 0.5
Rearranging:
12 = (R + 4)[ (12/R) - 0.5 ] = 12 - 0.5R + 48/R - 2 = 10 - 0.5R + 48/R
2 = 48/R - 0.5R
2R = 48 - 0.5R^2
0.5R^2 + 2R - 48 = 0
Multiplying through by 2 gives R^2 + 4R - 96 = 0, which factors as (R + 12)(R - 8) = 0, so:
R = -12 or R = 8
Hence, discarding the negative root, the original resistance is R = 8 ohms.
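If you'd rather not factor by hand, the quadratic formula gives the same roots (a quick Python sketch of the check):

import math

a, b, c = 0.5, 2.0, -48.0            # coefficients of 0.5R^2 + 2R - 48 = 0
disc = math.sqrt(b * b - 4 * a * c)  # discriminant is 100, so both roots are real
print((-b + disc) / (2 * a))         # 8.0 ohms, the physical root
print((-b - disc) / (2 * a))         # -12.0, discarded as unphysical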
Check:
When R = 8 ohms, I = 12/8 = 1.5A
When R = 8 + 4 = 12 ohms, I = 12/12 = 1A
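The same check in Python, again just Ohm's law:

V = 12.0
print(V / 8.0)    # 1.5A with the original 8 ohms
print(V / 12.0)   # 1.0A with 8 + 4 = 12 ohms: a drop of exactly 0.5A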