It's 75 W / 120 V, and the answer is in amps: 0.625 A.
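That arithmetic (current = power / voltage) can be sketched in a couple of lines of Python; the 75 W rating and 120 V supply are taken from the answer above:

```python
# Current drawn by a resistive load: I = P / V
power_w = 75.0     # bulb's rated power (75 W, per the question)
voltage_v = 120.0  # supply voltage (120 V mains)

current_a = power_w / voltage_v
print(f"{current_a:.3f} A")  # 0.625 A
```

The same two-line pattern covers any of the wattage/voltage combinations in this thread.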
0.8A
Current = voltage / resistance = 100 V / 130 Ω = 0.769 A = 769 milliamperes (rounded)
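The Ohm's-law step above, as a quick Python check (100 V and 130 Ω are the values from that answer):

```python
# Ohm's law: I = V / R
voltage_v = 100.0       # supply voltage from the answer above
resistance_ohm = 130.0  # load resistance from the answer above

current_a = voltage_v / resistance_ohm
print(f"{current_a:.3f} A = {current_a * 1000:.0f} mA")  # 0.769 A = 769 mA
```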
I'm not sure if this will work or not, but if you run a 9 V battery through an inverter and then a step-up transformer to convert the 9 V DC to 120 V AC, I think it will light the bulb. However, the available current decreases when you step the voltage up, and I don't know how much current a light bulb needs. Worth a try; I might actually try it now.
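The "current decreases" point follows from the fact that an ideal converter conserves power: the battery side must supply the same watts at a much lower voltage, so the battery-side current is correspondingly higher. A minimal sketch, assuming a lossless converter and an illustrative 0.5 A load at 120 V (roughly a 60 W bulb; both figures are assumptions, not from the thread):

```python
# Ideal power conversion: V_in * I_in = V_out * I_out (losses ignored)
v_battery = 9.0   # battery voltage (DC input)
v_output = 120.0  # stepped-up AC output
i_output = 0.5    # assumed load current at 120 V (~60 W bulb)

# The battery supplies the same power at 9 V instead of 120 V,
# so the battery-side current is 120/9 times the load current.
i_battery = i_output * v_output / v_battery
print(f"battery current: {i_battery:.1f} A for a {v_output * i_output:.0f} W load")
# battery current: 6.7 A for a 60 W load
```

Nearly 7 A of continuous draw is far beyond what a small 9 V battery can sustain, which is why this scheme tends to work only briefly, if at all.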
It can, but if you want to run a 120 V light bulb on DC, you'll need 120 V DC to get the rated output. That's a lot of batteries. It's easier, and more sensible, to find a DC-rated light bulb, such as an RV bulb.
What specifically are you wiring? A light bulb would operate dimly; a motor would burn up. The current increases, thereby requiring larger wire and overcurrent protection.
Offhand, no, but your explanation is not clear as to who is doing what to whom.
A 65 watt incandescent light bulb should draw 65 W / 120 V ≈ 541.67 mA.
To light a light bulb, you need to connect it to a source of electricity, such as a battery or an electrical outlet.
A light bulb is an electric light in which an electric current is passed through a filament wire, heating it until it glows. As the current passes through the filament, the bulb produces light.
Let's examine what it means when a bulb is 100 W rather than 60 W. I'm assuming that you meant they are 120 V bulbs being connected to a 240 V circuit.[1] With the same voltage on each, and because power is voltage times current, the current must be greater in a 100 W bulb than in a 60 W bulb. Since an incandescent bulb is a linear load, if you double the voltage then you double the current.[2] So the current through the 100 W bulb is still greater than through the 60 W bulb.

Or you may analyze it a bit further. With both on 120 V, for more current to flow in the 100 W bulb, its resistance must be less than that of the 60 W bulb. So you may generalize that at any voltage (the same voltage applied to each), the 100 W bulb will always carry more current than the 60 W bulb.

[1] Actually, if they are 120 V bulbs in a 240 V circuit, there is a high probability that they will blow out. But before they do, this is what will happen.

[2] Well, slightly less than double, because the temperature coefficient of the filament is positive: the hotter it gets, the greater its resistance. Although this may seem nonlinear, a light bulb or other temperature-sensitive resistive element is still defined as linear if, over the short term, it obeys Ohm's law at any instant of the waveform.

The current in the 100 watt bulb will be greater. Power is current times voltage, so current is power divided by voltage. The voltage is the same in both cases of this question, so current is proportional to power at 240 V.
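The resistance argument in that answer can be checked numerically. A minimal sketch, treating each bulb as a fixed resistance (which ignores the positive temperature coefficient of the filament, so the numbers are idealized):

```python
# Compare 60 W and 100 W bulbs, both rated for 120 V,
# modeled as fixed resistors: from P = V^2 / R, R = V^2 / P.
rated_v = 120.0

def filament_resistance(rated_power_w: float) -> float:
    """Resistance implied by the bulb's rating at its rated voltage."""
    return rated_v ** 2 / rated_power_w

r60 = filament_resistance(60.0)    # 240 ohms
r100 = filament_resistance(100.0)  # 144 ohms

# At any shared supply voltage, the lower-resistance 100 W bulb
# draws more current than the 60 W bulb.
for supply_v in (120.0, 240.0):
    i60 = supply_v / r60
    i100 = supply_v / r100
    print(f"{supply_v:.0f} V: 60 W bulb draws {i60:.3f} A, "
          f"100 W bulb draws {i100:.3f} A")
```

At 120 V this gives 0.5 A versus about 0.833 A, and doubling the voltage doubles both currents, matching the linear-load argument above.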
No, but it's usually quite easy to run a 220V line to wherever you want to put the bulb.