It is difficult to specify an average without sound statistical data, but you can make a few reasonable assumptions, resulting in a usable estimate. First, a little background:
Any electrical unit consumes electrical power, measured in watts (W). Consuming electrical power over time means 'consuming' electrical energy; for example, burning a 100W light bulb for one hour 'consumes' 100Wh, or 0.1kWh, of electrical energy. It is the number of kWh that you pay your electricity supplier for; for example, one electricity supplier in the UK charges £0.1152 per kWh.
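As a quick sanity check, this relationship (energy = power × time, cost = energy × tariff) can be sketched in a few lines of Python; the wattage and tariff are simply the figures from the example above:

```python
# Energy and cost for the 100W bulb example.
power_w = 100               # bulb rated at 100W
hours_on = 1                # burned for one hour
tariff_per_kwh = 0.1152     # GBP per kWh (example UK rate)

energy_kwh = power_w * hours_on / 1000   # 100Wh = 0.1kWh
cost = energy_kwh * tariff_per_kwh
print(energy_kwh)          # 0.1
print(round(cost, 5))      # 0.01152
```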
(I write 'consume' in quotes because it is not really possible to consume energy. According to the first law of thermodynamics, energy is only ever converted, never destroyed. The light bulb, for example, converts electrical energy into light and heat.)
Back to the computer.
An average desktop computer is designed to 'consume' up to 250W, but in normal-to-light use it will consume no more than 100W. When left idle, modern computers switch off a number of non-essential units, leading to even lower power consumption (perhaps 50W).
A laptop computer will typically consume around half the power of a comparable desktop PC.
A CRT monitor will consume 150-250W depending on its size, regardless of whether the computer is idle or not. However, most modern systems are configured to turn off parts of the monitor in order to reduce power consumption, bringing it down to a value in the region of 30W.
Modern LCD displays will consume approximately 50% of the power of a CRT.
Other components, such as keyboards, mice, etc, are negligible.
So, assume your PC is a desktop PC, in use for 3 hours with typical applications, and idle (or in very low and sporadic use) for the other 5 hours:
PC energy = 3h * 100W + 5h * 50W = 300Wh + 250Wh = 550Wh
The display will have similar on/off times, so assume this:
LCD energy = 3h * 80W + 5h * 35W = 240Wh + 175Wh = 415Wh
Total energy = PC energy + LCD energy = 550Wh + 415Wh = 965Wh (approximately 1 kWh).
So, running this PC for eight hours consumes approximately 1 kWh of electrical energy, costing as little as £0.1152 in the above example. While this seems inexpensive, note that a PC left running around the clock builds up to 24/8 * 365 * £0.1152 = £126 per year; even at eight hours a day it adds up to 365 * £0.1152 ≈ £42 per year.
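The whole estimate, including the annual projection, can be sketched as a small script; all wattages, hours, and the tariff are the assumptions made above. The results come out slightly below the rounded figures in the text because the script uses the exact 0.965kWh rather than the approximation of 1kWh:

```python
# Daily energy estimate from the assumed figures above.
tariff = 0.1152                 # GBP per kWh (example rate)
active_h, idle_h = 3, 5         # hours in use vs. idle per day

pc_wh  = active_h * 100 + idle_h * 50   # 300 + 250 = 550Wh
lcd_wh = active_h * 80 + idle_h * 35    # 240 + 175 = 415Wh
daily_kwh = (pc_wh + lcd_wh) / 1000     # 0.965kWh, roughly 1kWh

daily_cost  = daily_kwh * tariff
yearly_cost = daily_cost * 365                # eight hours of use per day
round_the_clock = yearly_cost * 24 / 8        # if the PC were never switched off
print(round(yearly_cost, 2))       # ~40.58
print(round(round_the_clock, 2))   # ~121.73
```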
You might use these figures for an initial estimation, but you must note that the true cost (i.e. the true power consumption) very much depends on make and model, and how you use this PC. Light word processing consumes little power in comparison with an action-packed video game, burning a DVD, or some serious number-crunching.
For a more accurate estimate, you would need to gather statistical data and review the above.
A computer monitor typically uses around 30-60 watts of electricity.
On average, a computer draws about 60 to 300 watts, depending on its size and usage.
To calculate the cost of running a device that draws 185 watts, you need to know the cost of electricity per kilowatt-hour. Assuming an average cost of $0.12 per kWh, running a 185-watt device for 24 hours a day would cost about $0.53 per day (185 watts / 1000 * 24 hours * $0.12 = $0.5328).
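That calculation can be sketched the same way; the 185W draw and the $0.12/kWh rate are the assumptions stated above:

```python
# Daily cost of a device drawing a constant 185W, at $0.12 per kWh.
watts = 185
rate = 0.12                     # USD per kWh (assumed average rate)
daily_kwh = watts / 1000 * 24   # 4.44kWh per day
daily_cost = daily_kwh * rate
print(round(daily_cost, 2))     # 0.53
```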