The average home fluorescent lamp consumes 40 W of power. Running for one hour, it consumes 0.04 kWh. Units of electricity are billed per kilowatt-hour (kWh).
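As a quick illustration of that conversion (a minimal sketch; the $0.15/kWh rate below is a made-up example value, not a real tariff):

```python
def energy_kwh(power_watts, hours):
    """Convert power (W) and runtime (h) to energy in kilowatt-hours."""
    return power_watts / 1000 * hours

# A 40 W lamp running for one hour:
print(energy_kwh(40, 1))         # 0.04 kWh
# Cost at a hypothetical rate of $0.15 per kWh:
print(energy_kwh(40, 1) * 0.15)  # 0.006 dollars
```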
It means that the power consumption of the bulb is 40 watts.
This is a bit less light than a 40W incandescent bulb produces (much less than a 9-watt CFL bulb, but twice as much as a 5-watt CFL mini-bulb).
Not very bright at all. A 40W light bulb is about 450 lm, so a 55 lm source would be about 1/8th as intense as a 40W bulb.
40 W = 0.04 kW; 0.04 kW × 12 hours = 0.48 kWh.
Joules = watts x seconds. Just convert the minutes to seconds, then multiply.
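For instance, a short sketch of that calculation (the 30-minute runtime is just an example value):

```python
def energy_joules(power_watts, minutes):
    """Energy in joules = power in watts x time in seconds."""
    seconds = minutes * 60  # convert minutes to seconds
    return power_watts * seconds

# A 40 W bulb running for 30 minutes:
print(energy_joules(40, 30))  # 72000 J
```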
Yes, your assumption is correct. Lamp fixtures are rated by how well they dissipate the heat given off by an incandescent light bulb. Since CFL lamps run much cooler, there is no problem using one in a fixture rated for an incandescent bulb of the same wattage.
If you mean the 40 watt light bulb inside, are you sure you are replacing it with a special 40W appliance bulb? A regular bulb will not last long at all inside of a refrigerator.
It's the 40W tube, because it's a nonlinear load: it generates harmonics, which increase the apparent power and thus the apparent energy.
If the fixtures were otherwise identical, and one held a single bulb while the other held two, then no. The light output of a bulb is indicated by its wattage on the package. So a 100W bulb gives off less light than two 75W bulbs together, because the two together equal 150W.
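To make the comparison concrete (a rough sketch; the 15 lm/W efficacy figure is an assumed typical value for incandescent bulbs, and real bulbs vary):

```python
LUMENS_PER_WATT = 15  # assumed typical efficacy for incandescent bulbs

def approx_lumens(*bulb_watts):
    """Rough light-output estimate: total wattage times assumed efficacy."""
    return sum(bulb_watts) * LUMENS_PER_WATT

print(approx_lumens(100))     # one 100W bulb:  ~1500 lm
print(approx_lumens(75, 75))  # two 75W bulbs:  ~2250 lm
```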
-- The voltage makes no difference. -- The 400W device dissipates ten times as much power as the 40W device does. We don't know how much of each one's power consumption is radiated in the form of heat, UV light, etc. But if the spectral distribution of their output is similar, then the one that dissipates more power produces more visible light, and appears brighter.
The heat dissipation is what the fixture is rated for. They are saying a maximum heat output of 25 watts, so 40 watts is going to be too much.