The average LCD TV uses about 350 watts, which meets ENERGY STAR requirements. The average plasma TV uses over 1,000 watts.
In any case, the question doesn't make sense. Watts are a measure of energy flow, NOT an amount of energy. In half an hour a 350-watt TV uses 630,000 joules of energy. A 700-watt TV, or any other 700-watt appliance, uses the same amount of energy in a quarter of an hour. 350 watts means 350 joules per second, and it makes no sense to say 350 watts/second/half-hour. It is a 350-watt appliance whether it runs for half an hour or a week. It is always 350 watts.
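The joule arithmetic above can be checked with a short Python sketch (the 700-watt figure is just an illustrative appliance rating):

```python
def energy_joules(power_watts, seconds):
    # Energy = power x time: a watt is one joule per second.
    return power_watts * seconds

half_hour = 30 * 60  # 1800 seconds
e = energy_joules(350, half_hour)
print(e)  # 630000 joules

# A 700-watt appliance burns through the same energy in a quarter hour:
print(e / 700)  # 900 seconds = 15 minutes
```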
1,000 watts
100 watts
3/4 of a watt
5.5 watts is 0.0055 kilowatts. In one hour the equipment uses 0.0055 kilowatt-hours.
A kilowatt is 1,000 watts. A 60-watt bulb uses 60 watt-hours in an hour, so in half an hour it uses 30 watt-hours, or 0.03 kilowatt-hours. Now, if a kilowatt-hour costs 20 cents, what does 0.03 kilowatt-hours cost?
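That cost calculation can be sketched in Python (the 20-cents-per-kilowatt-hour rate is the example price from above, not a real tariff):

```python
def cost_cents(power_watts, hours, cents_per_kwh):
    # Convert watts to kilowatts, multiply by hours to get energy in kWh,
    # then multiply by the price per kWh.
    kwh = power_watts / 1000 * hours
    return kwh * cents_per_kwh

# A 60-watt bulb running for half an hour at 20 cents per kWh:
print(cost_cents(60, 0.5, 20))  # 0.6 cents
```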
About half as much power as the computer. A computer uses 200-500 watts.
About 15 watts; it's not a lot.
Half of what is in a horsepower, or about 373 watts (one horsepower is about 746 watts).
2.4705 watts
1000
Watts are units for measuring the rate of energy consumption. So it is meaningless to speak of how many watts something consumes in a length of time. (It would be like asking how many miles per hour a car drives in an hour.) Energy consumption may be measured in kilowatt-hours. A typical microwave consumes 1500 watts, which would be 1.5 kilowatt-hours in one hour.
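The watt-to-kilowatt-hour conversion in that answer can be sketched as (the 1500-watt microwave is the example rating from above):

```python
def kilowatt_hours(power_watts, hours):
    # Kilowatt-hours measure energy; watts measure the rate of use.
    return power_watts * hours / 1000

print(kilowatt_hours(1500, 1))    # 1.5 kWh in one hour
print(kilowatt_hours(1500, 0.5))  # 0.75 kWh in half an hour
```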
Not days, just half an hour.