
There is no easy answer to this.

Firstly, it depends on the fridge. Obviously a huge double-door refrigerator will consume a lot more energy than a tiny bar fridge.

Also, they only consume a lot of energy when the compressor is running. The compressor will turn on when the thermostat (i.e., the temperature sensor) shows that the temperature inside the fridge has risen too far. How quickly this happens (and how often) will depend on the temperature outside the fridge, compared to what temperature you've set the dial to inside.

So, a fridge will consume far less energy on a winter's night than it would in the middle of a hot summer's day. And it will consume more energy the lower you set the temperature dial, because it has to work harder to keep the fridge colder.

Having said all of that, an average 16 cubic-foot frost-free fridge draws around 700 watts while the compressor is running. But it might only be running at that full 700 watts, say, 20% of the time. So your average draw in that case would be about 140 watts, which works out to 0.14 kWh (kilowatt-hours, the correct unit for energy consumption) for every hour the fridge is plugged in.
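As a rough sketch of that arithmetic (using the 700 W rating and 20% duty cycle quoted above; any other figures would slot in the same way):

# Rough fridge energy estimate (Python), assuming the 700 W rating and
# 20% compressor duty cycle mentioned above.
rated_watts = 700      # draw while the compressor is running
duty_cycle = 0.20      # fraction of the time the compressor runs

average_watts = rated_watts * duty_cycle       # 140 W average draw
kwh_per_hour = average_watts / 1000            # 0.14 kWh each hour
kwh_per_day = kwh_per_hour * 24                # about 3.4 kWh per day

print(f"{average_watts:.0f} W average, {kwh_per_day:.1f} kWh per day")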

Generally, for a new fridge in the store, you can look at the sticker on the fridge, which will tell you its consumption rate in kWh. Or, if it's an old fridge you already own and the stickers are all gone, you could buy a power consumption meter from the hardware store. Basically, you unplug the fridge (oven, TV, whatever), plug in the meter, then plug the fridge into the meter. Run it for a couple of days, and the meter will tell you how much energy your device has been consuming.
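For example, if such a meter showed 6.8 kWh after two days (the reading and electricity price below are made-up illustration numbers, not real measurements), you could turn that into a daily average and a running cost like this:

# Hypothetical plug-in meter reading: 6.8 kWh accumulated over 48 hours.
metered_kwh = 6.8
hours_metered = 48
price_per_kwh = 0.15   # assumed tariff in dollars per kWh

kwh_per_day = metered_kwh / hours_metered * 24     # 3.4 kWh per day
monthly_cost = kwh_per_day * 30 * price_per_kwh    # about $15.30 a month

print(f"{kwh_per_day:.1f} kWh per day, roughly ${monthly_cost:.2f} per month")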

To further complicate things, the wattage rating also depends on the power source used. In America you have 110 volts, whereas in Australia, for example, the voltage is 240. This changes the wattage calculation, so an equivalent fridge can carry a different consumption rating in different countries. The 700-watt example above was obtained from the US Department of Energy, so it applies to a 110-volt power supply.
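The relationship behind that is watts = volts x amps, so the current drawn for a given wattage changes with the supply voltage. A small sketch, using the 700 W figure above as the assumed load:

# Current drawn by the same 700 W load on different supply voltages
# (watts = volts * amps, so amps = watts / volts).
load_watts = 700

for volts in (110, 240):
    amps = load_watts / volts
    print(f"{load_watts} W at {volts} V draws about {amps:.1f} A")

# Roughly 6.4 A at 110 V, but only about 2.9 A at 240 V.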

I told you there was no easy answer!!!

Wiki User

14y ago

Related Questions

How many watts in a 115 volt refrigerator?

That depends on how many amps the fridge is pulling. Multiply the amps by the 120 volts of the circuit you're plugging into and you'll get your watts.


How many amps are in 40 volt amps?

Multiply the volts by the amps to find the volt-amps, or divide the volt-amps by the voltage to find the amps.


How many volts amps required a 12 volt audio radio?

To determine the amperage required by a 12-volt audio radio, you would need to know the power consumption of the radio in watts. You can then use the formula: Amps = Watts / Volts to calculate the amperage.


What is the formula for finding out how many amps is required to achieve the full 100 watt output from a 9 volt bulb?

The formula is volts times amps equals watts, or watts divided by volts equals amps. For a full 100-watt output from a 9-volt supply, that works out to 100 / 9, or about 11.1 amps.
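Since several of the questions here come down to the same volts/amps/watts relationship, here is a minimal sketch of that arithmetic (the helper names are just illustrative):

# watts = volts * amps, rearranged both ways.
def watts_from(volts: float, amps: float) -> float:
    return volts * amps

def amps_from(watts: float, volts: float) -> float:
    return watts / volts

print(amps_from(100, 9))      # 100 W from a 9 V supply -> about 11.1 A
print(amps_from(2000, 220))   # 2000 W at 220 V -> about 9.09 A
print(watts_from(120, 1.5))   # a hypothetical 1.5 A load on 120 V -> 180 W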


How many volt amps in 12 volt battery?

That depends on the battery. The current a car battery can deliver is listed on it as cold cranking amps (CCA); multiplying that figure by 12 volts gives the volt-amps.


9 volt power supply in place of a 10 volt?

Probably ok if the new supply can produce the required amount of current in amps.


How many amps to charge a Chevy Volt?

50


How many amps will a 0.25 kilo volt amp transformer at 120 volts put out?

0.25 kVA is 250 VA, and 250 VA divided by 120 volts is 2.083 amps.


How many amps will a 240 volt single phase 15 kva generator supply?

15 kVA divided by 240 volts is 62.5 amps.


How many amps can a 40 va 120 volt transformer with a 24 volt secondary carry?

To determine the current in amps that a 40 VA transformer can carry on its secondary side, you can use the formula: Amps = VA / Voltage. For a 24-volt secondary, the calculation is 40 VA / 24 V, which equals approximately 1.67 amps. Therefore, the transformer can carry about 1.67 amps on its 24-volt secondary.


How do you compute the 2000 watts in 220 volt to how many amps?

Watts = volts x amps, so amps = watts / volts. Therefore 2000 / 220 = 9.09 amps.


How many amps in a kilowatt?

On a 240-volt, 50 Hz supply, a kilowatt equates to just over 4 amps (1000 / 240 is about 4.2).