That depends on the voltage.
To answer this question a voltage must be given, since Watts = Amps x Volts. At 115 volts AC, 30 amps equals 3,450 watts.
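As a minimal sketch of that arithmetic in Python (the variable names are my own, not from the question):

```python
# Power (watts) = current (amps) x voltage (volts)
amps = 30
volts = 115
watts = amps * volts
print(watts)  # 3450
```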
20 Amps x 120 Volts = 2,400 Watts. 2,400 Watts x 80% = 1,920 Watts of planned normal usage for a circuit with a 20 Amp breaker.
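The same 80% rule of thumb as a short sketch (Python, assuming the 20 amp / 120 volt figures above):

```python
# Plan continuous loads at 80% of the breaker's full capacity.
breaker_amps = 20
volts = 120
max_watts = breaker_amps * volts   # 2400 W absolute maximum
planned_watts = 0.80 * max_watts   # 1920 W planned normal usage
print(max_watts, planned_watts)    # 2400 1920.0
```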
Watts equals volts multiplied by amps. This would therefore be a five-amp circuit.
Watts = voltage times amps. So if you divide watts by voltage, you get amps; here that works out to 0.33333, or about a 1/3 amp load. This assumes a 120 volt circuit.
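Here is that division as a sketch (Python); the 40-watt figure is my assumption, chosen only because it reproduces the 0.33333 amps in the answer:

```python
# Rearranged: amps = watts / volts
watts = 40   # assumed load; 40 W / 120 V gives the 0.33333 A above
volts = 120
amps = watts / volts
print(round(amps, 5))  # 0.33333, about a 1/3 amp load
```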
2400 watts.
120 watts of power flow through a circuit with 1 amp and 120 volts.
A 15 amp circuit can supply approximately 1,650 watts, so 1650 / 65 ≈ 25. I would stop at 20.
The electrical code states that circuit conductors fed by this breaker can only be loaded to 80% on a continuous load. Therefore you can have a load of 1,920 watts on this circuit. Assuming you install 8-watt bulbs, you can have 240 of them on this circuit.
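Putting the 80% derating and the bulb count together in a short Python sketch (assuming a 20 amp, 120 volt circuit, which is what the 1,920 watt figure implies):

```python
# Usable watts after the 80% continuous-load derating,
# divided by the wattage of each bulb.
breaker_amps = 20
volts = 120
bulb_watts = 8
usable_watts = breaker_amps * volts * 0.80  # 1920 W
print(int(usable_watts // bulb_watts))      # 240 bulbs
```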
Any appliances that draw over 1500 watts should be on a 20 amp circuit.
The formula you are looking for is Watts = Amps x Volts.
About 2.3 kW on a 110-120 volt circuit.
About 4,800 watts, but you should not use it at 100%, so to be safe plan on about 3,840 watts (80%).