The real-world answer is that the bulb glows with dazzling brilliance for a few
seconds and then burns out.
Electrically, the technical answer is:
Assuming we're working with a traditional incandescent bulb, the filament is designed
to dissipate 25 W when connected to 220 V. We could easily calculate the resistance of
the filament from that information, but for this discussion it doesn't matter; it's just 'R'.
The power dissipated by any resistive device is P = V²/R.
So if the voltage across the device is doubled, the power dissipation increases by
a factor of 4. Your incandescent bulb begins to dissipate 100 W instead of 25 W.
Refinement:
The resistance of any metallic conductor increases when the temperature of the
conductor rises. The temperature of the filament certainly increases when its
power dissipation increases.
So 'R' increases, and the dissipation settles at something a little less than 100
watts. But have no fear ... at 100% over-voltage, your bulb will definitely blow.
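The numbers above can be checked with a quick calculation. This is a sketch that treats the filament as a fixed resistor, deliberately ignoring the temperature effect just described:

```python
# Cold-model check: treat the filament as a constant resistance 'R'.
V_rated = 220.0   # rated voltage (V)
P_rated = 25.0    # rated power (W)

R = V_rated**2 / P_rated    # from P = V^2 / R
P_at_440 = 440.0**2 / R     # dissipation when the voltage is doubled

print(R)         # 1936.0 ohms
print(P_at_440)  # 100.0 W -- four times the rated power
```

In reality R rises with filament temperature, so the actual dissipation settles somewhat below this figure, as noted above.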
All three: on 110 V, a split receptacle; on 220 V, a baseboard heater; on 440 V, a construction heater or similar resistive load.
The 25 W bulb, since it has the much higher resistance. The resistance can be derived from:
P = V²/R, so R = V²/P
For the 100 W bulb: R = 220²/100 = 484 ohms
For the 25 W bulb: R = 220²/25 = 1936 ohms
When the two are connected in series across 440 V, the voltage across the 100 W bulb would be:
V = 440 × 484/(484 + 1936) = 88 V
This is well within spec. The voltage across the 25 W bulb would be:
V = 440 × 1936/(484 + 1936) = 352 V
This is way over spec, and would cause the bulb to fuse.
Although this answer assumes that a light bulb is a linear resistor, it is not: the resistance of a light bulb changes significantly with voltage and filament temperature. The 25 W bulb is still the one that fuses, but the non-linearity of the resistance needs to be understood.
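The series voltage divider above can be reproduced numerically. This sketch uses the same linear-resistor simplification the answer itself warns about:

```python
V_supply = 440.0
R_100 = 220.0**2 / 100.0   # 484 ohms for the 100 W bulb
R_25  = 220.0**2 / 25.0    # 1936 ohms for the 25 W bulb

# In a series circuit the voltage divides in proportion to resistance.
V_100 = V_supply * R_100 / (R_100 + R_25)
V_25  = V_supply * R_25  / (R_100 + R_25)

print(round(V_100))  # 88  -- within the 220 V rating
print(round(V_25))   # 352 -- far over the rating; this bulb fuses
```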
If the frequency is kept the same, you will overexcite the transformer, and it will draw excessive current (similar to inrush currents). Insulation tests are performed on transformers above nominal voltage, but they are performed at higher than rated frequency to keep the volts per hertz roughly equal to prevent overexciting the core.
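The volts-per-hertz reasoning can be illustrated with a small check. The 60 Hz base frequency and the specific test values here are assumed for illustration only:

```python
# Core flux in a transformer is proportional to V/f (volts per hertz).
V_rated, f_rated = 220.0, 60.0
v_per_hz_rated = V_rated / f_rated

# Doubled voltage at the same frequency: V/f doubles, core overexcited.
print(440.0 / 60.0 > v_per_hz_rated)    # True

# Doubled voltage at doubled frequency: V/f unchanged, flux stays at rated level.
print(440.0 / 120.0 == v_per_hz_rated)  # True
```

This is why induced-voltage insulation tests run at an elevated frequency: the higher test voltage stresses the insulation without saturating the core.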
Because we don't require very high voltage in the home, 220 V is generally preferred for domestic use, while industries use a 3-phase connection (440 V) for heavy loads.
Depending on the design of the motor, it may just run with less torque/hp. It might run slightly hotter as well, depending on the load.
For the same power in watts, you need to run twice as many amps at 220 V as at 440 V. For the same load (a fixed resistance), it will pull half the amps at 220 V that it did at 440 V.
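Both statements follow from I = P/V and Ohm's law respectively. A minimal sketch, using an assumed 4.4 kW example load:

```python
P = 4400.0  # example power draw in watts (an assumed figure)

# Same power at two voltages: current doubles when voltage halves (I = P/V).
I_440 = P / 440.0   # 10 A
I_220 = P / 220.0   # 20 A

# Same fixed resistive load at two voltages: current halves with voltage (I = V/R).
R = 440.0**2 / P        # 44 ohms, sized to draw P at 440 V
I_load_440 = 440.0 / R  # 10 A
I_load_220 = 220.0 / R  # 5 A

print(I_440, I_220, I_load_440, I_load_220)
```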
Short answer: No (unless you like ruining motors). Longer answer: Not really, but there are some motors that can be re-wired so that instead of 440 V they will run on 220 V. However, this is not that common. Conclusion: Check the motor's nameplate to see if it is dual-voltage. If it is, you can re-connect it to work on 220 V. If it is NOT a dual-voltage motor, you will absolutely ruin it if you connect it to 220 V.
As many as it was designed for. Tonnage doesn't equate to voltage, and there are 12 V automotive systems capable of such tonnage (particularly in tractor-trailer units). It could be 12 V, 24 V, 110 V, 220 V, 440 V, etc.
In a 3-phase motor, link terminals U1, V1, W1 together at the shorting point, and connect the remaining terminals U2, V2, W2 to the incoming 440 V power supply.
35 µF / 440 V