Are you saying that you plug in a lamp and another lamp on that extension cord goes dimmer? If so, the extension cord wire is of too small a gauge size and is causing a voltage drop from the outlet to the cord's output.
A: Absolutely not. Lamps need current, and a long cord introduces an I×R drop, leaving less voltage at the lamp and therefore less current for it to use.
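As a rough illustration of that I×R drop, here is a minimal Python sketch. All numbers are assumptions for illustration: a hypothetical 15 m cord of 18 AWG copper (roughly 0.021 Ω per metre) feeding a 60 W / 120 V lamp.

```python
# Voltage drop along an extension cord feeding an incandescent lamp.
# Assumed values: 18 AWG copper at ~0.021 ohm/m, and a 60 W / 120 V
# lamp with a hot resistance of about V^2 / P = 240 ohm.

CORD_LENGTH_M = 15.0          # assumed one-way cord length
OHMS_PER_M = 0.021            # assumed resistance of 18 AWG copper
SUPPLY_V = 120.0
LAMP_R = 120.0 ** 2 / 60.0    # hot resistance of a 60 W / 120 V lamp

# Current flows out and back, so the cord contributes twice its one-way resistance.
cord_r = 2 * CORD_LENGTH_M * OHMS_PER_M
current = SUPPLY_V / (cord_r + LAMP_R)   # series circuit: one current
drop = current * cord_r                  # the I x R drop across the cord
print(f"cord resistance: {cord_r:.2f} ohm")
print(f"current: {current:.3f} A, cord drop: {drop:.2f} V")
print(f"voltage at the lamp: {SUPPLY_V - drop:.1f} V")
```

The drop scales with the load current and the cord resistance, so a heavier load, a longer cord, or a thinner gauge all make the dimming worse.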
A: Because both items are connected in series. Resistances connected in series all carry the same current, regardless of their values or how many there are. However, for an incandescent lamp the resistance changes as it turns on and heats up: a filament has quite different properties when cold than when hot.
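A minimal sketch of that point, assuming two identical 60 W / 120 V lamps with a hot resistance of about 240 Ω and a cold resistance of roughly a tenth of that:

```python
# Two lamps in series: one current flows through both, set by the total resistance.
# Assumed values: each 60 W / 120 V lamp is ~240 ohm hot and ~24 ohm cold.

SUPPLY_V = 120.0
HOT_R = 240.0
COLD_R = 24.0   # assumed cold filament resistance, ~1/10 of hot

for label, r_each in (("cold", COLD_R), ("hot", HOT_R)):
    total_r = 2 * r_each             # series resistances add
    current = SUPPLY_V / total_r     # the same current flows through both lamps
    print(f"{label}: total R = {total_r:.0f} ohm, shared current = {current:.2f} A")
```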
Assuming you are referring to house lamps, these are always connected in parallel with each other. Each lamp will draw a current, the value of which depends on the wattage of the lamps. As each lamp is added, the supply current will increase by the amount of current drawn by that lamp.
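A quick sketch of how the supply current adds up, assuming a 120 V supply and some illustrative lamp wattages:

```python
# Parallel lamps: each lamp sees the full supply voltage, and the supply
# current is the sum of the individual lamp currents (I = P / V at rated voltage).

SUPPLY_V = 120.0
lamp_watts = [60, 60, 100]   # assumed wattages of the lamps being added

total_a = 0.0
for i, watts in enumerate(lamp_watts, start=1):
    total_a += watts / SUPPLY_V
    print(f"after lamp {i} ({watts} W): supply current = {total_a:.2f} A")
```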
A: The current is not a function of the voltage available but rather of the power needed to light the lamp. A 100 W lamp rated at 120 V will require 1.83 times more current than a 100 W lamp rated at 220 V. It is the basic ratio 220:120.
A: I disagree with the previous answer. The power rating of a lamp only applies when the lamp is subjected to its rated voltage, which is why both values are shown on the lamp (e.g. 60 W / 120 V). So, if you subject a lamp to less than its rated voltage, it will not achieve its rated power. In fact, the decrease in power will be significantly greater than the corresponding decrease in voltage. It will certainly not 'compensate' by drawing more current! However, to directly answer your question, the current drawn by a lamp connected to a 220 V supply will indeed be greater than the current drawn by the same lamp connected to a 110 V supply.
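To put rough numbers on both answers, here is a sketch covering the rated-voltage ratio and the under-voltage case. Treating the hot resistance as fixed in the second part is itself an approximation; a cooler filament actually has somewhat lower resistance.

```python
# Current drawn by 100 W lamps at their *rated* voltages (I = P / V):
i_120 = 100 / 120   # lamp rated 100 W at 120 V
i_220 = 100 / 220   # lamp rated 100 W at 220 V
print(f"rated currents: {i_120:.3f} A vs {i_220:.3f} A "
      f"(ratio {i_120 / i_220:.2f}, i.e. 220:120)")

# The same physical lamp at *less* than its rated voltage does not hold its
# rated power. Treating the hot resistance as fixed (an approximation):
r_hot = 120 ** 2 / 100        # hot resistance of the 100 W / 120 V lamp
p_at_60v = 60 ** 2 / r_hot    # power if connected to only 60 V
print(f"approx. power at 60 V: {p_at_60v:.0f} W (not 100 W)")
```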
It will if the batteries are connected in series. If they are connected in parallel, the lamp will burn longer, but not brighter.
If an insulator is connected in series with a lamp in an electrical circuit and the switch is turned on, the lamp will not light up. Insulators do not allow electric current to flow, so the circuit will not be completed and no current will reach the lamp to make it light up.
Why does the lamp glow when it is connected to the AC supply?
I = E/R
Since power is volts times amps, the current in a 60 W lamp connected to 120 V is 0.5 A. Since a lamp is a resistive load, there is no need to consider power factor and phase angle, which simplifies the explanation.
Assuming this is an incandescent or halogen lamp (one that uses a filament to make light), there is a trick here: the resistance of a lamp filament varies with temperature and does not follow Ohm's law. The resistance will be much lower, and thus the current much higher, when the filament is cold, at the moment the lamp is first connected. As the filament heats up, the resistance increases until the lamp reaches a steady operating point of 0.5 A. For a halogen lamp, the operating temperature is about 2800-3400 K, so the resistance at room temperature is about 16 times lower than when hot; at switch-on the current is therefore about 8 A, but it drops rapidly. The current could be even higher if the lamp is in a cold environment. Non-halogen lamps operate at a lower filament temperature and would have a lower initial current, about 5 A. All of this assumes the lamp is rated for 120 V. If it is a 12 V / 60 W lamp, the filament will probably break and create an arc, which may draw a very large current.
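A back-of-envelope version of that inrush estimate; the 16:1 cold-to-hot resistance ratio is taken from the answer above and treated here as an assumption:

```python
# Inrush current estimate for a filament lamp: the cold filament resistance
# is far lower than the hot operating resistance, so the switch-on current
# is far above the steady-state 0.5 A.

SUPPLY_V = 120.0
POWER_W = 60.0
COLD_HOT_RATIO = 16           # assumed: halogen filament ~16x lower R when cold

steady_a = POWER_W / SUPPLY_V     # steady operating current (0.5 A)
hot_r = SUPPLY_V / steady_a       # hot resistance from Ohm's law
cold_r = hot_r / COLD_HOT_RATIO
inrush_a = SUPPLY_V / cold_r      # momentary current at switch-on
print(f"hot R = {hot_r:.0f} ohm, cold R = {cold_r:.0f} ohm")
print(f"steady current = {steady_a:.1f} A, initial inrush = {inrush_a:.0f} A")
```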
Parallel
The formula you are looking for is R = E/I. Resistance is stated in ohms.
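For example, applying R = E/I with assumed meter readings for a lamp:

```python
# R = E / I: a lamp's operating resistance from measured values.
# The 120 V and 0.5 A readings are assumptions for illustration.

e_volts = 120.0   # measured voltage across the lamp
i_amps = 0.5      # measured current through it
r_ohms = e_volts / i_amps
print(f"R = {e_volts} V / {i_amps} A = {r_ohms:.0f} ohm")
```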
For current to flow through the lamp, there must be a potential difference (voltage) applied across opposite ends of that lamp.