Physics, asked by peerfarooqahmad88, 1 year ago

To produce 10³ joules of heat in 10 seconds, how much voltage should be applied across a 100 ohm resistance?

Answers

Answered by vasanthij97

Given:

R = 100 ohms, H = 1000 J, time = 10 seconds

Formula: H = V²t/R

1000 = V² × 10/100

1000 × 100/10 = V²

10000 = V²

V = 100 volts

∴ The voltage supplied must be 100 volts.

Answered by muscardinus

The applied voltage is 100 volts.

Explanation:

Given that,

Heat produced, H = 10³ J

Time, t = 10 s

Resistance, R = 100 ohms

We need to find the voltage that should be applied. The heat produced in a resistor is given by:

H = I²Rt

Since I = V/R,

H = V²t/R

V = √(HR/t)

V = √(10³ × 100/10)

V = 100 volts

So, the applied voltage is 100 volts.
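
As a quick cross-check of the arithmetic above, here is a minimal Python sketch (using only the standard math module; the variable names are illustrative, not taken from the original answers) that evaluates V = √(HR/t) for the given values:

import math

H = 1000   # heat produced, in joules
t = 10     # time, in seconds
R = 100    # resistance, in ohms

# From H = V² t / R, solving for V gives V = sqrt(H * R / t)
V = math.sqrt(H * R / t)
print(V)   # prints 100.0, i.e. an applied voltage of 100 volts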

Learn more,

Effects of electric current

https://brainly.in/question/716367
