To answer this question, two more factors are needed: how long the television is on in a 24-hour period, and how much you pay per kilowatt-hour on your electricity bill.
A 32" flat-screen TV draws 100-150 watts.
You have to know what voltage it is plugged into and how many amps it draws: volts x amps = watts. Look at the electrical plate on the back of the TV. For example, in the US it might be 120 volts x 5 amps = 600 watts, or 0.6 kilowatts; at roughly 10 cents per kWh, that works out to about 6 cents per hour of use. This tells you how much electricity the TV uses, not how much it "has."
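The nameplate arithmetic above can be sketched as a few lines of Python. The voltage, amperage, and electricity rate are illustrative assumptions (check your own TV's plate and your own bill), and note that a nameplate rating is a maximum; actual draw is usually lower.

```python
# Estimating hourly running cost from the nameplate rating (volts x amps).
# All values below are example assumptions, not measurements.
volts = 120   # supply voltage printed on the plate (US example)
amps = 5      # current rating from the plate (a maximum, not typical draw)
rate = 0.10   # assumed electricity price in dollars per kWh

watts = volts * amps              # 120 * 5 = 600 W
kilowatts = watts / 1000          # 0.6 kW
cost_per_hour = kilowatts * rate  # dollars for each hour the set is on

print(f"{watts} W -> about ${cost_per_hour:.2f} per hour")
```

At the assumed 10-cent rate this reproduces the "about 6 cents per hour" figure from the answer above.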
Depending on where you live in the USA, that amount of electrical energy costs between 17¢ and 38¢ per day.
The watt is a unit of power, i.e. energy per unit time. Therefore, "watts per hour" is the wrong phrase: while the TV is on, it draws so-and-so many watts (joules per second); while it is off, it doesn't. TVs vary widely in their usage; CRTs (the big bulky ones) use more than modern flat-screen TVs. Look at the back of your TV for the electrical specifications. Perhaps you want to know how much you spend per hour: 200 watts (for example) is the same as 200 watt-hours per hour, or 0.2 kilowatt-hours per hour. To convert this into money, look at a bill from the power company to see how much you pay for every kWh.
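That watts-to-money conversion can be written out directly. The 200-watt draw and the 12-cents-per-kWh rate below are example assumptions taken for illustration:

```python
# Converting a power draw into an hourly running cost.
power_watts = 200       # assumed draw while the TV is on
rate_per_kwh = 0.12     # assumed price from a power bill, dollars per kWh

# 200 W sustained for one hour is 0.2 kWh of energy.
energy_kwh_per_hour = power_watts / 1000
hourly_cost = energy_kwh_per_hour * rate_per_kwh

print(f"{energy_kwh_per_hour} kWh per hour -> ${hourly_cost:.3f} per hour")
```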
A set dissipating 1,000 watts gives off 1,000 joules of heat energy for every second that the dissipation continues.
An average television uses 120-130 watts of power, so if it is used for eight hours per day it consumes about one unit (kWh) of electrical energy; in the UK that costs about £0.12.
If it conforms to Version 5.1 of the Energy Star Program Requirements, a 32" TV (16:9 aspect ratio) would use only 55 watts or less, according to the document at http://www.energystar.gov/ia/partners/product_specs/program_reqs/tv_vcr_prog_req.pdf. Does anybody have a range for how much non-Energy-Star units use?
Some televisions can use as much electricity as a major household appliance. When looking to buy a new big screen TV, be sure to look for the Energy Guide label. This will give an estimate of how much electricity costs will be for a year to run the television. Not only will buying an Energy Star compliant television help your wallet, but it will help the environment as well.
Convert the watts to kilowatts, multiply by the time to get the energy (in kWh), then multiply by the rate.
That will depend on (a) the type of television set, (b) the cost of electricity in your region, and (c) how long you keep it turned on. Take a look at the TV set's electrical specifications to see how much it uses in watts; most electric and electronic devices have a small metal plate that tells you. If it doesn't state the wattage, multiply volts x amperes to get watts. Multiply that by the number of hours you want to have it turned on to get kilowatt-hours. Then take a look at an electric bill and divide the total amount by the number of kWh used, to get an estimate of the cost per kWh.
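The whole procedure in the answer above (watts from the plate, or volts x amps; times hours; times your rate) can be sketched as one small function. The default hours and rate are assumptions for illustration only; plug in your own numbers:

```python
def tv_cost(volts=None, amps=None, watts=None, hours=4, rate_per_kwh=0.12):
    """Estimate the cost of running a TV for the given number of hours.

    If the nameplate doesn't state watts, derive them from volts x amps.
    The default 4 hours/day and $0.12/kWh are illustrative assumptions.
    """
    if watts is None:
        watts = volts * amps          # fall back to volts x amps
    kwh = watts / 1000 * hours        # energy used, in kilowatt-hours
    return kwh * rate_per_kwh         # cost in the bill's currency

# Example: a 120 W set on for 4 hours at $0.12/kWh
# uses 0.48 kWh, costing roughly $0.058.
print(f"${tv_cost(watts=120):.3f} per day")
```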
3000 watts
The TV's electrical efficiency depends on the type, size, and brand of the television. Typically a TV uses 80 to 400 watts of electricity. Bigger televisions use more energy than smaller ones, and LCDs are more efficient than CRTs.
It obviously varies from TV to TV, but between 0.3 and 10 watts, with newer ones taking less.
Roughly 100 watts while the set is on (a watt measures power, not energy per day).