It really depends on the computer. If you have an old-style CRT monitor that stays on too, it may use far more power than the PC itself. You can figure the number for your particular setup; here's how.

First, electrical consumption is actually billed in kilowatt-hours, not kilowatts. A kilowatt is one thousand watts. If you have a device that draws 1,000 watts and you leave it on for one hour, you have consumed (and will have to pay for) one kilowatt-hour.

Sometimes appliances are rated in watts, sometimes in amps. Simply multiply amps by 120 (assuming you're in the US) to get watts. 100 watts is 0.1 kilowatts, 500 watts is 0.5 kilowatts, and 1,230 watts would be 1.23 kilowatts. 3 amps is 0.36 kilowatts (3 X 120 = 360 watts = 0.36 kilowatts). Get it?

Here's my setup as an example. My PC is a laptop; the power supply says 95 watts, which is 0.095 kilowatts. I put the laptop in a docking station so I can use a real keyboard and monitor. My monitor says 2 amps, which is 0.24 kilowatts (2 X 120 = 240 watts). See, my monitor actually uses more electricity than the laptop! That's why most modern monitors have a power-save feature that turns them off after a period of inactivity.

So, the total power needed is 0.335 kilowatts (0.095 + 0.24). If I leave both on for a whole month (there are about 720 hours in a month), I would use 241.2 kilowatt-hours (0.335 X 720). At my electrical rate of 10.6 cents ($0.106) per kilowatt-hour, that electricity will cost me $25.57 ($0.106 X 241.2).
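If you'd rather let the computer do the arithmetic, here is a small Python sketch of the same calculation. The specific numbers (95 watts, 2 amps, 720 hours, $0.106 per kilowatt-hour) are just the ones from my example above; swap in your own ratings and rate.

    # Rough monthly electricity cost for a PC setup.
    # Values below come from the example in the answer; replace with your own.

    US_VOLTAGE = 120  # assumed US line voltage, in volts

    def amps_to_watts(amps, volts=US_VOLTAGE):
        """Convert a current rating in amps to power in watts."""
        return amps * volts

    def monthly_cost(total_watts, rate_per_kwh, hours=720):
        """Energy and cost of running a load continuously for 'hours' (about one month)."""
        kilowatts = total_watts / 1000      # watts -> kilowatts
        kwh = kilowatts * hours             # kilowatt-hours used over the month
        return kwh, kwh * rate_per_kwh

    laptop_watts = 95                       # from the power-supply label
    monitor_watts = amps_to_watts(2)        # 2 A x 120 V = 240 W
    kwh, cost = monthly_cost(laptop_watts + monitor_watts, rate_per_kwh=0.106)

    print(f"{kwh:.1f} kWh per month, about ${cost:.2f}")
    # prints: 241.2 kWh per month, about $25.57

The only real inputs are the wattage of each device (or amps times 120), the hours per month it stays on, and your price per kilowatt-hour from your electric bill.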
