It is 795 divided by 120, which works out to about 6.6 A. The formula you are looking for is I = W/E (amps = watts/volts).
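As a minimal sketch of that calculation in Python, using the 795 W load and 120 V supply from the answer above:

```python
# Power relation: I = W / E (amps = watts / volts)
watts = 795.0   # load power from the question
volts = 120.0   # household supply voltage

amps = watts / volts
print(f"{amps:.3f} A")  # 6.625 A
```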
To find the necessary resistance, you first need to know how much current the bulb draws. Subtract 14 volts from 120 volts, then divide the remaining 106 volts by the bulb's current to get the required resistance. Then multiply the current by that 106-volt drop to get the wattage rating the resistor needs. Another important detail is that the power dissipated by the resistor will be much greater than the power consumed by the bulb itself. Finally, if the bulb burns out, the full 120 V will appear across its contacts. I would not recommend using this method to drop the voltage for the bulb.
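A sketch of those steps, assuming a hypothetical bulb that draws 0.25 A (the actual figure depends on the bulb's rating and is not given in the question):

```python
supply_v = 120.0
bulb_v = 14.0
bulb_i = 0.25                       # assumed bulb current, for illustration only

drop_v = supply_v - bulb_v          # 106 V must be dropped across the series resistor
resistance = drop_v / bulb_i        # R = V / I  -> 424 ohms
resistor_w = drop_v * bulb_i        # P = V * I  -> 26.5 W dissipated in the resistor
bulb_w = bulb_v * bulb_i            # 3.5 W consumed by the bulb

print(resistance, resistor_w, bulb_w)  # 424.0 26.5 3.5
```

Note how, with these assumed numbers, the resistor burns 26.5 W just to deliver 3.5 W to the bulb, which is why the answer advises against this approach.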
An electric clothes dryer demands 22 A from a 240 V outlet at approximately 90% power factor, so the power demand on the outlet should be about 240 V x 22 A x 0.9 = 4.75 kW. The active components in an electric clothes dryer are the heating element (100% PF) and the electric motor that turns the tumbler (70-80% PF). The formula you are looking for is W = I x E x PF. (W is watts, I is current in amps, E is volts, PF is power factor.)
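The same arithmetic as a small Python sketch, using the values from the answer above; it also shows the apparent power (VA) for comparison:

```python
volts = 240.0
amps = 22.0
power_factor = 0.9  # heating element ~1.0, tumbler motor ~0.7-0.8, combined ~0.9

real_power_w = volts * amps * power_factor   # W = E * I * PF
apparent_va = volts * amps                   # volt-amps drawn from the outlet
print(f"{real_power_w / 1000:.2f} kW real, {apparent_va / 1000:.2f} kVA apparent")
# 4.75 kW real, 5.28 kVA apparent
```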
Basically:
Power = Current x Voltage
Current = Power / Voltage
Current = 15 / 120
Current = 0.125 A, or 125 mA
Amps (current) times volts equals watts, so watts divided by volts gives the current in amps; in this case, 0.5 A.
The fuse is what restricts how much current you can draw from an outlet. With the wrong fuse you can draw more current than intended from an outlet, but only if the appliance plugged in can actually use it. To get too much power out of an outlet with a suitable appliance plugged in, you would also need a higher voltage at the outlet, and for that to happen, something has to be seriously wrong higher up in the supply chain.
Power is the product of voltage and current, so you need to know the current the load draws to establish the power. It is a bad idea to use a 230 V adapter in a 120 V outlet.
If the terminal voltage decreases when more current is drawn, that is due to the internal resistance of the power supply. Every power supply has a limit to how much current can be drawn, set by its internal resistance: by Ohm's law, the more current drawn through a resistor, the more voltage is dropped across it. That internal drop opposes the terminal voltage and is subtracted from it.
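A small sketch of that relationship, with assumed values for the EMF and internal resistance (neither is given in the answer):

```python
emf = 12.0          # open-circuit voltage of the supply (assumed)
r_internal = 0.5    # internal resistance in ohms (assumed)

for current in (0.0, 1.0, 2.0, 4.0):
    terminal_v = emf - current * r_internal  # internal drop subtracts from the EMF
    print(f"{current:.0f} A -> {terminal_v:.1f} V at the terminals")
# 0 A -> 12.0 V, 1 A -> 11.5 V, 2 A -> 11.0 V, 4 A -> 10.0 V
```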
A 120 V household electrical outlet supplies 12 watts of power when the current is 0.1 ampere (and the power factor is 100%).
No ~ the power supply is capable of supplying 5 amps, but the device will only draw 1 amp. Therefore the power supply can still safely supply another 4 amps if required.
It's important to understand that the '5 A' referred to in the specification of your power supply is its capacity; the actual current drawn from the power supply is determined by the load you attach to it. So, if your load requires only 1 A, then that's how much current will be drawn from the power supply.
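A trivial illustration of capacity versus actual draw (the variable names are mine, for illustration only):

```python
supply_capacity_a = 5.0   # what the supply *can* deliver
load_draw_a = 1.0         # what this particular load actually demands

headroom_a = supply_capacity_a - load_draw_a
print(f"Drawing {load_draw_a} A; {headroom_a} A of capacity remains unused")  # 4.0 A unused
```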
I don't know the maximum amount of equipment that can be hooked up to an outdoor power outlet, but I'm pretty sure you can ask any hardware store for that type of information.