Best Answer

Any consumer microwave can run on a 15-amp circuit; a model that needs more should be specified as such, but those microwaves are not consumer products. I have a 1500-watt unit on a 15-amp circuit, by the way.

Wiki User

12y ago
Q: What wattage of microwave can be used on a 15-amp circuit?
Continue Learning about Natural Sciences

Can you safely use a 15 amp circuit for a microwave rated at 12 amps?

Yes, the microwave draws less than the circuit's protective rating and will not trip the breaker if used on a 15 amp circuit.


How many amps does a small microwave use?

About 13 amps, and it should be on a dedicated outlet, since one outlet has a maximum capacity of 15 amps. If the supply voltage is 120 volts, the amperage is I = W/E: Amps = Watts/Volts = 1450/120 = 12.08 amps.
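As a quick sketch in Python, using the 1450 W and 120 V figures from the answer above (the function name is just for illustration):

```python
# I = W / E: amps drawn equals watts divided by volts.
# 1450 W and 120 V are the figures quoted in the answer above.
def amps(watts, volts=120.0):
    return watts / volts

print(round(amps(1450), 2))  # 12.08
```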


What gauge wire is needed for a 19-bulb chandelier?

To answer this question the total wattage of the fixture is needed. Find the wattage of one bulb; if all of the bulbs are the same, multiply that wattage by 19. Use the total wattage in the equation I = W/E (Amps = Watts/Volts). In North America a #14 conductor is rated at 15 amps but has to be de-rated for a continuous load, which allows it to carry 12 amps legally. A #12 conductor is rated at 20 amps but de-rates to 16 amps for a continuous load. A #10 conductor is rated at 30 amps but de-rates to 24 amps for a continuous load. Hopefully the chandelier falls within this amperage range.
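The sizing logic above can be sketched in Python. The de-rated ampacities are the ones quoted in the answer; the 40 W bulb wattage is a made-up example value:

```python
# De-rated continuous-load ampacities quoted in the answer:
# #14 -> 12 A, #12 -> 16 A, #10 -> 24 A. The 40 W bulb is a made-up example.
DERATED_AMPACITY = {"#14": 12, "#12": 16, "#10": 24}

def smallest_gauge(bulb_watts, bulb_count=19, volts=120.0):
    amps = bulb_watts * bulb_count / volts  # I = W / E
    for gauge in ("#14", "#12", "#10"):     # try the smallest conductor first
        if amps <= DERATED_AMPACITY[gauge]:
            return gauge
    return None  # load exceeds the gauges discussed

print(smallest_gauge(40))  # #14 (40 W x 19 = 760 W -> 6.3 A)
```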


240 V 60 Hz, how many amps?

Volts and frequency are two different values. Amps are found with the following formulas: I = E/R, I = W/E, or I = √(W/R). None of these formulas uses the 60 Hz frequency. At what wattage? Solving for amps: A = watts/volts. If you have the wattage and the voltage, you can compute the amps. Example: A = 120 watts / 110 volts = 1.09 amps.
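A sketch of the three equivalent formulas in Python; the 240 V, 2400 W, 24 Ω load is a hypothetical example chosen so all three give the same current:

```python
import math

# Three equivalent ways to find current, per the answer; frequency never appears.
def amps_from_volts_resistance(volts, ohms):      # I = E / R
    return volts / ohms

def amps_from_watts_volts(watts, volts):          # I = W / E
    return watts / volts

def amps_from_watts_resistance(watts, ohms):      # I = sqrt(W / R)
    return math.sqrt(watts / ohms)

# Hypothetical 240 V, 2400 W load; R = E^2 / W = 24 ohms.
print(amps_from_volts_resistance(240, 24))   # 10.0
print(amps_from_watts_volts(2400, 240))      # 10.0
print(amps_from_watts_resistance(2400, 24))  # 10.0
```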


How many CFL bulbs can you use on a single pole dimmer?

The number of CFL bulbs that can be used on a single pole dimmer depends on the wattage of the bulbs and the maximum load capacity of the dimmer. Each dimmer has a specified maximum load capacity, usually measured in watts, which indicates the total wattage it can handle. To determine how many CFL bulbs can be used, divide the dimmer's maximum load capacity by the wattage of the CFL bulbs being used. Make sure not to exceed the dimmer's maximum load capacity to avoid potential overheating and damage.
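A minimal sketch of that division in Python, assuming a hypothetical 600 W dimmer and 13 W CFLs:

```python
# Bulb count = dimmer's maximum load // bulb wattage, never exceeding the rating.
# The 600 W dimmer and 13 W CFLs are hypothetical example values.
def max_bulbs(dimmer_max_watts, bulb_watts):
    return dimmer_max_watts // bulb_watts

print(max_bulbs(600, 13))  # 46
```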

Related questions

If a microwave is on 15 amps, what is the wattage?

Volts × amps = watts. A standard circuit: 120 volts × 15 amps = 1800 watts. Microwaves generally use about 1000 watts, which is 8.33 amps.
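The same arithmetic in Python:

```python
# Watts = volts x amps: the ceiling of a 15 A, 120 V circuit,
# and the draw of a typical 1000 W microwave, per the answer above.
max_watts = 120 * 15        # 1800 W available on the circuit
typical_amps = 1000 / 120   # about 8.33 A
print(max_watts, round(typical_amps, 2))  # 1800 8.33
```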



Does the microwave need its own circuit?

Yes, every microwave should have its own dedicated circuit, as some consume 15 to 18 amps.



Can a 1000-volt microwave be connected to a 15-amp receptacle?

To my knowledge there is no such thing as a 1000-volt cooking microwave oven. If you mean 1000 watts, then the answer to your question is yes. W = A × V. Presuming that the 15-amp receptacle is on a 120-volt system, the amperage draw of a 1000-watt microwave oven would be A = W/V = 1000/120 = 8.3 amps, with 6.7 amps to spare.
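The same calculation as a Python sketch, with the 1000 W load and 15 A breaker from the answer:

```python
# A = W / V, then spare capacity = breaker rating - draw, as in the answer.
def draw_and_spare(load_watts, breaker_amps=15, volts=120.0):
    draw = load_watts / volts
    return draw, breaker_amps - draw

draw, spare = draw_and_spare(1000)
print(round(draw, 1), round(spare, 1))  # 8.3 6.7
```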


Can an 1100 W mini oven safely be plugged into a standard household socket?

Yes, an 1100-watt oven can be plugged into a standard household socket. The wattage is the voltage at the outlet (120 volts AC) times the amperage (unknown, though you could look at the label on the back or bottom of the appliance): volts times amperage equals wattage. Equivalently, wattage divided by voltage gives the amperage, or current. If the amperage is below 15 it is okay to plug it in. Some kitchen outlets are rated higher, at 20 amps, but most outlets are 15 amp. 1100 watts divided by 120 volts gives about 9 amps, well within the limits of the electrical circuit.


Can you use five 60-watt bulbs in a light fixture that is wired with 18-gauge wire?

Two thoughts here. First, the fixture should be rated at the maximum wattage allowed for the socket the bulb screws into; a label should state "maximum wattage allowed". Second, do the calculation: Watts = Amps × Volts, so Amps = Watts/Volts. 5 × 60 watts = 300 watts, and 300/120 = 2.5 amps. The electrical code only rates down to #14 wire, which is rated at 15 amps. From here you have to make the decision.


Does a pool pump on 220 volts use the same amount of energy as one on 110 volts?

Assuming it is also rated for 120 V, yes. The wattage doesn't change with an increase or decrease in voltage; the current draw does. When you double the voltage a load is hooked up to, the amperage draw (current) drops in half. Example: if a 240-volt heater draws 6 amps, it will draw 12 amps when connected to 120 V; if a 120-volt heater draws 15 amps, it will draw only 7.5 amps when connected to 240 V. But the power, or wattage, stays the same regardless, and that is what is used to calculate energy usage and therefore cost. Please note the above answer says "if it is also rated for" that voltage.
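A sketch of the constant-power tradeoff, using the 240 V, 6 A heater from the example above (1440 W):

```python
# P = V x I is fixed, so doubling the voltage halves the current.
# 1440 W is the 240 V, 6 A heater from the example above.
def current(power_watts, volts):
    return power_watts / volts

print(current(1440, 240))  # 6.0 A at 240 V
print(current(1440, 120))  # 12.0 A at 120 V - same power, twice the current
```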


How many amps does a 12 volt 600 watt transformer use at 120 volts?

A transformer does not "use" amps; it transforms voltage from one value to another, and the output amperage is governed by the connected load. If the load wattage is higher than the wattage rating of the transformer, either the primary or secondary fuse will blow, or the transformer will burn up if the fusing is the wrong size. The maximum primary amperage is found with the equation Amps = Watts/Volts: A = W/E = 600/120 = 5 amps. The same equation gives the maximum secondary amperage: A = W/E = 600/12 = 50 amps.
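The two maximums as a Python sketch, using the 600 W rating and voltages from the question:

```python
# A = W / E on each side of the 600 W transformer from the question.
RATING_W = 600

primary_max = RATING_W / 120   # 120 V primary side
secondary_max = RATING_W / 12  # 12 V secondary side
print(primary_max, secondary_max)  # 5.0 50.0
```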