A lamp will only operate at its rated power at its rated voltage. So if you connected a 230-V lamp to a 110-V supply, its brightness would be very low. It does not matter whether the supply is AC or DC; a lamp would have the same brilliance at 230 V (AC) as at 230 V (DC).
As a rule of thumb for fixed-value resistances, a 10% drop in voltage results in a 19% drop in power.
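A minimal sketch of that rule of thumb, assuming a purely fixed resistance and an illustrative 100 W / 230 V rating (neither figure is from the question; a real filament's resistance also changes with temperature):

```python
# Power drawn by a fixed resistance falls with the square of the voltage (P = V^2 / R).
rated_voltage = 230.0                          # volts
rated_power = 100.0                            # watts (assumed example rating)
resistance = rated_voltage ** 2 / rated_power  # R = V^2 / P, held constant here

for supply in (207.0, 110.0):                  # a 10% drop, and a 110 V supply
    power = supply ** 2 / resistance
    drop = (1 - power / rated_power) * 100
    print(f"{supply:5.0f} V -> {power:5.1f} W ({drop:.0f}% power drop)")
```

At 207 V (a 10% drop) this gives roughly a 19% drop in power, and on a 110 V supply the 230 V lamp delivers only about a quarter of its rated power, which is why its brightness is so low.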
Ohm's law: V = I × R, rearranged to I = V / R, so I = 110 V / 20 Ω = 5.5 A.
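A quick check of that working (the 110 V and 20 Ω figures are the ones used above):

```python
# Ohm's law: V = I * R, rearranged to I = V / R
voltage = 110.0      # volts
resistance = 20.0    # ohms
current = voltage / resistance
print(f"I = {voltage:.0f} V / {resistance:.0f} ohms = {current} A")  # 5.5 A
```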
On a three-wire supply system, if you connect the two 110 V wires directly together and they are taken from across the phase, they will short out and trip the breaker. If the two 110 V wires are supplied from across the phase and connected to a motor, then the motor will run. If the two 110 V wires are on the same phase, nothing will happen.
The voltage ratio of a potential, or voltage, transformer (PT or VT) depends upon the primary voltage to which it is connected. Accordingly, its voltage ratio varies considerably, as there is a huge variety of system voltages throughout the world. Typically a VT's secondary voltage is standardised at 110 V, which matches the full-scale deflection of a voltmeter connected to it (although it can also supply protective relays), while its primary voltage is matched to the voltage of the system to which it is connected. In the UK, for example: 11 kV : 110 V, 33 kV : 110 V, etc.
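A small sketch of how those ratios work out, using the UK examples above (the 110 V standardised secondary is from the text; the rest is just the ratio arithmetic):

```python
# VT ratio: secondary standardised at 110 V, primary matched to the system voltage
secondary = 110.0                       # volts
for primary in (11_000.0, 33_000.0):    # UK examples: 11 kV and 33 kV systems
    print(f"{primary / 1000:.0f} kV : 110 V  ->  ratio {primary / secondary:.0f}:1")
```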
1,760 Wh
Either 110 V, or both 220 V and 110 V, depending on how your house is wired.
P = V × I, so 60 W = 110 V × I, giving I = 60/110 ≈ 0.55 A. Hopefully in your house you are connected to 110 V AC and not DC. The same answer as above applies.
The current flowing through a 75-watt light bulb connected to a 110-volt wall outlet can be calculated from the formula Power = Voltage × Current, rearranged as Current = Power ÷ Voltage. Therefore, the current flowing through the light bulb would be 75 W ÷ 110 V ≈ 0.68 amps.
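A minimal sketch covering both bulb calculations above (it assumes the rated power is actually dissipated at 110 V):

```python
# Current drawn by a bulb: P = V * I, so I = P / V
supply = 110.0                    # volts
for rated_power in (60.0, 75.0):  # watts, the two bulbs discussed above
    print(f"{rated_power:.0f} W bulb: I = {rated_power / supply:.2f} A")
```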
The current in the light bulb will be greater when connected to the 200 V source than in the 110 V circuit, assuming the resistance of the light bulb remains constant. This is because current is directly proportional to voltage according to Ohm's Law (I = V/R), so a higher voltage results in a greater current flow through the bulb.
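A short sketch of that comparison, assuming a constant bulb resistance (the 121 Ω figure is illustrative, borrowed from one of the answers below):

```python
# For a fixed resistance, current is proportional to voltage: I = V / R
resistance = 121.0               # ohms (illustrative value)
for supply in (110.0, 200.0):    # volts
    print(f"{supply:.0f} V -> {supply / resistance:.2f} A")
```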
The formula you are looking for is I = E/R.
It will burn out due to excess current.
Please state your line voltage (e.g. 230 V, 400 kV)!
The breaker will more than likely trip. If not, the appliance will not operate properly.
I = V / R = 110 V / 121 Ω ≈ 0.909 A
We have two bulbs in parallel drawing 75 + 40 = 115 watts from 110 volts.
I = current (amperes), V = potential (volts), W = power (watts), R = resistance (ohms).
Knowing W = V × I, V = I × R, and W = R × I², then:
115 W = 110 V × I, so I = 115/110 ≈ 1.045 A
R = 115/(115/110)² = 110²/115 ≈ 105.2 Ω
-------------------------------------------------------------------------------------------
Another way: first get the resistance of each bulb, then use R_parallel = 1 / (1/R1 + 1/R2).
75 W = 110 V × I, so I = 75/110 A, and R1 = 75/(75/110)² = 110²/75 ≈ 161.3 Ω.
For the other bulb, 40 W = 110 V × I, so I = 40/110 A, and R2 = 40/(40/110)² = 110²/40 = 302.5 Ω.
Meaning R_parallel = 1 / (1/161.3 + 1/302.5) ≈ 105.2 Ω.
That's it.
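A short check of both routes in that working (75 W and 40 W bulbs on 110 V, constant resistances assumed):

```python
# Two bulbs in parallel on 110 V: total current, then the equivalent resistance
# found both from the combined power and from the parallel-resistance formula.
supply = 110.0
powers = (75.0, 40.0)                              # watts

total_power = sum(powers)                          # 115 W
total_current = total_power / supply               # I = P / V  ≈ 1.045 A
r_from_power = supply ** 2 / total_power           # R = V^2 / P ≈ 105.2 ohms

resistances = [supply ** 2 / p for p in powers]    # ≈ 161.3 and 302.5 ohms
r_parallel = 1 / sum(1 / r for r in resistances)   # 1 / (1/R1 + 1/R2) ≈ 105.2 ohms

print(f"I = {total_current:.3f} A, R = {r_from_power:.1f} ≈ {r_parallel:.1f} ohms")
```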
Yes, the bulb will actually last longer.
Probably because years ago the voltage supply in most residential applications was 110/220V. It was increased to 120/240V to increase efficiency. If you use a heater element rated at 230V in a 240V application you will be "overvolting" it and will probably reduce the life of the element. If using an element rated at 240V in a 230V application you will be "undervolting" it and it will probably not get quite as warm as it is designed to.
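A rough sketch of the over/undervolting effect, treating the element as a fixed resistance and using an assumed 1 kW rating (not from the question):

```python
# Power from a fixed-resistance heating element at an off-rated voltage: P = V^2 / R
def power_at(supply_v, rated_v, rated_w):
    resistance = rated_v ** 2 / rated_w        # element resistance from its rating
    return supply_v ** 2 / resistance

rated_w = 1000.0                               # watts, assumed example rating
print(f"230 V element on 240 V: {power_at(240, 230, rated_w):.0f} W")  # ~1089 W
print(f"240 V element on 230 V: {power_at(230, 240, rated_w):.0f} W")  # ~918 W
```

So the mismatch shifts the output by roughly 8-9% either way, which is why the overvolted element runs hotter and ages faster, and the undervolted one runs slightly cooler.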
No. The bulb will burn out.