Q: How many kW per hour does a Home computer use?
Best Answer

It really depends on the computer. If you have an old-style CRT monitor that stays on too, it may use far more power than the PC itself. You can work out the number for your particular setup; here's how.

First, electrical consumption is actually billed in kilowatt-hours, not kilowatts. A kilowatt is one thousand watts. If you have a device that draws 1,000 watts and you leave it on for one hour, you have consumed (and will have to pay for) one kilowatt-hour.

Sometimes appliances are rated in watts, sometimes in amps. Simply multiply amps by 120 (assuming you're in the US) to get watts. 100 watts is 0.1 kilowatts, 500 watts is 0.5 kilowatts, and 1,230 watts would be 1.23 kilowatts. 3 amps is 0.36 kilowatts (3 x 120 = 360 watts = 0.36 kilowatts). Get it?

Here's my setup as an example. My PC is a laptop; the power supply says 95 watts, which is 0.095 kilowatts. I put the laptop in a docking station so I can use a real keyboard and monitor. My monitor says 2 amps, which is 0.24 kilowatts (2 x 120 = 240 watts). See, my monitor actually uses more electricity than the laptop! That's why most modern monitors have a power-save feature that turns them off after a period of inactivity.

So the total power needed is 0.335 kilowatts (0.24 + 0.095). If I leave them on for a whole month (there are about 720 hours in a month), I would use 241.2 kilowatt-hours (0.335 x 720). At my electrical rate of 10.6 cents ($0.106) per kilowatt-hour, that electricity will cost me $25.57 (0.106 x 241.2).
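Here is a minimal Python sketch of that same calculation, using the figures from the example above; substitute your own wattages, hours, and rate:

```python
# Estimate the monthly electricity cost of a computer setup.
# The figures below are the ones from the example above; swap in your own.

def amps_to_watts(amps, volts=120):
    """Convert a nameplate current rating to watts (120 V assumed, as in the US)."""
    return amps * volts

laptop_watts = 95                     # laptop power supply rating
monitor_watts = amps_to_watts(2)      # monitor rated at 2 A -> 240 W

total_kw = (laptop_watts + monitor_watts) / 1000    # 0.335 kW
hours_per_month = 720                               # roughly 30 days x 24 h
kwh_per_month = total_kw * hours_per_month          # 241.2 kWh

rate_per_kwh = 0.106                  # dollars per kWh
cost = kwh_per_month * rate_per_kwh

print(f"{kwh_per_month:.1f} kWh/month -> ${cost:.2f}")   # 241.2 kWh/month -> $25.57
```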


Wiki User

15y ago
More answers

Wiki User

13y ago

You don't "use" kilowatts; you use kilowatt hours. The kilowatt is an unit of power. Power is the rate at which you use energy which, for electrical installations, is measured in kilowatt hours. Put another way, power is an instantaneous value, whereas energy is an accumulated value. In simple terms, you can compare 'power' with 'miles per hour', and 'energy' with 'miles travelled'.


Wiki User

12y ago

This is a hard question to answer because of the number of variables in the equation. If it is a small home with a small distribution panel, the home is limited as to what appliances can be used. This lowers the wattage used, which is reflected on your power bill. Larger homes will have more electrical circuits and usually more appliances, thereby using more electricity. If the personality of the homeowner requires lights to be on when it is dark in rooms that are not occupied, this drives the cost up. Without knowing the specifics of the home, an answer is not forthcoming. As an example: an 1,800 sq ft home using 1,650 kWh per month, at a utility rate of $0.09 (9 cents) per kWh, comes to $148.50 a month.
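A quick check of the arithmetic in that example, using the usage and rate figures quoted above:

```python
# Sanity check of the example figure above: 1,650 kWh at $0.09 per kWh.
monthly_kwh = 1650
rate_per_kwh = 0.09                  # dollars per kWh (9 cents)
print(monthly_kwh * rate_per_kwh)    # 148.5 -> $148.50 per month
```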


Wiki User

12y ago

No electrical appliance uses "kilowatts per hour". The question is meaningless and should be rephrased as "How many kWh (kilowatt hours) per hour does a home computer use?", or, better, "How many kW (kilowatts) does a home computer use?"

Typical home computers use around 100 watts (one tenth of a kW), which is, of course, one tenth of a kWh per hour. This is around 55 W for the computer and 45 W for the monitor. Printers take extra. Many computers have energy-saving modes when they are not being used, and their consumption may then drop to 10 W (one hundredth of a kW) or less.
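A rough sketch of what those figures mean over a month. The 100 W and 10 W values are the round numbers quoted above; the 8-hours-active, 16-hours-idle split is just an assumed usage pattern for illustration:

```python
# Compare a computer left fully on with one that drops into power-save mode.
# 100 W active and 10 W in power-save are the round figures quoted above;
# the 8 h/day active, 16 h/day idle split is an assumed usage pattern.

def monthly_kwh(watts, hours):
    """Energy (kWh) used over `hours` at a constant draw of `watts`."""
    return watts / 1000 * hours

always_on = monthly_kwh(100, 720)                                      # 72 kWh
with_power_save = monthly_kwh(100, 8 * 30) + monthly_kwh(10, 16 * 30)  # 28.8 kWh

print(f"Always on:       {always_on:.0f} kWh/month")
print(f"With power-save: {with_power_save:.1f} kWh/month")
```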

