A home or personal computer consumes more electricity than a television.
A home computer consumes roughly 500-550 kilowatt-hours per year, while a TV consumes roughly 200-250 kilowatt-hours per year, assuming both are in regular use.
This is not a very good question because it leaves out a lot of important details. For example, are we talking about a simple 20-inch box fan, a 10-inch personal fan, or a big blower unit for a central air conditioning system? How big is the television? Is it a little 15-inch LCD television that one might have on the kitchen counter, or is it the 60-inch widescreen curved LCD television in the living room, or even an old 32-inch CRT (old-style tube) television that has not yet died and been replaced by modern technology?
If one looks at, say, a 40-inch smart LCD television, it may use as little as 40 watts (as my Samsung model does). A 20-inch box fan (a pretty common device found in homes) may use as much as 80 watts (as the one I just checked does). So, given this, the fan would use more electricity.
If your question is so that you can figure out how much each device costs per month to operate, then let me do a bit of the math for you. The 40-inch TV uses 40 watts, so it would take 25 hours of continuous use to consume a single kilowatt-hour of electricity. Since most folks use the TV for only about five or six hours a day, it would take a few days to reach that single kilowatt-hour. To keep the math simple, let's use the five-hours-per-day figure: at that rate, it takes five days to use one kilowatt-hour. Over a 30-day billing cycle, that is 6 kilowatt-hours. In my part of the US, the cost per kilowatt-hour is about 13 cents, so the TV would cost 78 cents over the course of a month.
Looking at the fan, if we presume for a moment that the average usage of five hours per day is the same, then with its power usage being double that of the TV (80 watts compared to 40 watts), the monthly bill would also be doubled at $1.56.
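The arithmetic above can be sketched in a few lines of Python. The wattages, the five-hours-per-day usage, and the 13-cents-per-kWh rate are the figures assumed in this answer; plug in your own numbers from your device nameplates and your utility bill:

```python
def monthly_cost(watts, hours_per_day, days=30, rate_per_kwh=0.13):
    """Estimate a device's monthly electricity cost in dollars."""
    kwh = watts * hours_per_day * days / 1000  # watt-hours -> kilowatt-hours
    return kwh * rate_per_kwh

print(f"TV:  ${monthly_cost(40, 5):.2f}")   # 6 kWh over 30 days -> $0.78
print(f"Fan: ${monthly_cost(80, 5):.2f}")   # 12 kWh over 30 days -> $1.56
```

At double the wattage and the same daily usage, the fan's cost comes out exactly double the TV's, matching the figures above.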
I hope all this helps. If you have any other questions, feel free to drop me a line and let me know.
It depends on the wattage of the device. Check the nameplate on the device.
Whichever has the largest wattage.
Energy is energy. It does not matter what you are using the energy for; it matters how much energy you are using. A 200-watt light bulb uses more energy than a 100-watt light bulb. A 200-watt motor uses more energy than a 100-watt light bulb. A 200-watt heater uses more energy than a 100-watt light bulb.
no
3 pence an hour
no
It depends on where you live. Most use as much electricity as a 60W light bulb. There are some energy-saving ones that would make it cheaper, though.
Electricity is consumed when it is hooked to something that turns it into something else. A lamp turns electricity into light, using some up in the process. A fridge uses electricity to cool its insides, using some of it up. A fan uses electricity to move air around, and some of it is used up.
No. A fan needs to be plugged into electricity to work, but it does not have a charger.
Most fans have an electric motor that runs on electricity. The motor uses electromagnetic force to spin the fan blades.
Gunpowder, or a light bulb / the use of electricity.
None. Light bulbs use electricity, they do not create electricity.
Before the light bulb was invented, people used lanterns, candles, and fire for light. Today, with modern technology, we use electricity to produce light, as with the light bulb. So the most common way to make light is with electricity.
It stores electricity that is used as a boost when first starting the fan.
Because light bulbs use electricity and candles use fire, and electricity lasts longer.
No, not easily.
No
yes
A fan can use either voltage, depending on what the manufacturer nameplates the motor voltage to be. AC is the most common, but some smaller fans, such as power-supply fans in computers, use DC. Check the fan motor's nameplate and supply the correct voltage.