That depends on the wiring, the light socket, the switch(es), and any other equipment in the circuit. You should never use a bulb of higher wattage than the circuit was designed for.
If by "consume" you mean "waste as heat", that would depend upon the design of the transformer, but would typically be a few watts of heat loss.
If your device uses 900 watts at 7.5 amps, then it requires 120 volts. If you want to use it where the supply voltage is 220 volts, you'll need a transformer, but only if the device can operate on 50 Hz: most places that use 220 volts supply it at 50 Hz. If your device's nameplate says it can operate on 50 Hz, you can use a transformer.
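To see where the 120 volts comes from, here's a quick Python sketch of that arithmetic, using the figures from the answer above:

```python
# With P = V * I, a device using 900 W at 7.5 A must be on a
# 900 / 7.5 = 120 V supply.
power_watts = 900.0
current_amps = 7.5

voltage = power_watts / current_amps  # V = P / I
print(f"Required supply voltage: {voltage:.0f} V")  # prints 120 V
```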
A kilowatt-hour is the energy used by a 1000-watt load running for one hour. A 50-watt bulb draws 50 watts continuously, so over 12 hours it uses 50 × 12 = 600 watt-hours, or 0.6 of a kilowatt-hour.
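A quick Python sketch of that calculation, using the figures above:

```python
# Energy in kilowatt-hours = power (W) x time (h) / 1000.
# Values from the answer: a 50 W bulb running for 12 hours.
power_watts = 50.0
hours = 12.0

energy_kwh = power_watts * hours / 1000.0
print(f"Energy used: {energy_kwh} kWh")  # prints 0.6 kWh
```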
Yes.
Yes.
Watts are a measure of power. If the transformer still drew 600 W with the lights mostly or totally switched off, you'd have a circuit generating 600 W of heat somewhere. Worse, when you switched the lights on, that 600 W wouldn't disappear, so the total drain would be 1.2 kW.

---- The above answer is misleading. The 600 watts on the transformer nameplate is the maximum wattage the transformer can deliver while staying within its safety limits; it does not draw that wattage all the time. A transformer only supplies the power the load demands. With two 50-watt lamps connected, it has 500 watts of capacity left. It can supply up to twelve 50-watt bulbs (12 × 50 = 600); any more than twelve and the transformer is in an overload condition.
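A small Python sketch of the capacity arithmetic in the corrected answer above:

```python
# Headroom calculation for the 600 W transformer described above.
transformer_rating_w = 600.0
bulb_w = 50.0
connected_bulbs = 2

load_w = connected_bulbs * bulb_w                 # 100 W actually drawn
headroom_w = transformer_rating_w - load_w        # 500 W of capacity left
max_bulbs = int(transformer_rating_w // bulb_w)   # 12 bulbs before overload

print(f"Load: {load_w:.0f} W, headroom: {headroom_w:.0f} W")
print(f"Maximum bulbs before overload: {max_bulbs}")
```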
16 AWG wire is plenty large enough for a 50- or 100-watt lamp.
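As a rough illustration, here's a Python sketch of the current such a lamp draws. It assumes a 120 V supply and a conservative ~10 A rating for 16 AWG flexible cord, neither of which is stated in the answer; real ratings depend on insulation type and the applicable electrical code.

```python
# Rough sanity check. Assumptions (not from the answer): 120 V supply,
# ~10 A rating for 16 AWG flexible cord.
supply_voltage = 120.0
awg16_rating_amps = 10.0  # assumed conservative figure

for lamp_watts in (50.0, 100.0):
    current = lamp_watts / supply_voltage
    print(f"{lamp_watts:.0f} W lamp draws {current:.2f} A "
          f"(within rating: {current < awg16_rating_amps})")
```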
A 75-watt bulb will use more electricity.
If by "consume" you mean "waste as heat", that would depend upon the design of the transformer, but would typically be a few watts of heat loss.
Yes. It just won't be as bright.
A 50 watt bulb designed to run on 12 volts takes 4.17 amps. A 50 watt bulb designed to run on 230 volts takes 0.217 amps.
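A short Python sketch of the I = P / V arithmetic behind those two figures:

```python
# Current drawn by the same 50 W bulb at two design voltages: I = P / V.
power_watts = 50.0
for voltage in (12.0, 230.0):
    current = power_watts / voltage
    print(f"{power_watts:.0f} W at {voltage:.0f} V draws {current:.3f} A")
# 12 V  -> 4.167 A
# 230 V -> 0.217 A
```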
Your stereo system has an internal transformer in its power supply that is designed to operate at a given frequency. If that transformer can handle your stereo's load at 50 Hz, then yes; otherwise it will overheat when connected at the wrong frequency and burn out. You might be lucky and the power supply may be labelled to operate at 50 or 60 Hz; if not, I recommend contacting the manufacturer. A voltage transformer will convert 50 Hz 220 V to 50 Hz 110 V; to change frequency, you need special equipment.
Yes.