The voltage across a DC device that draws 2 A and consumes 12 Wh per hour (i.e., 12 W) is 12 / 2, or 6 V, since voltage equals power divided by current.
Ohm's law: voltage = current × resistance. Solving for resistance: resistance = voltage ÷ current. So a device drawing 50 mA at 150 V has a resistance of 150 / 0.05, or 3,000 ohms. P.S. Since power is volts times amps, that device is also dissipating 150 × 0.05 = 7.5 watts.
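A minimal Python sketch of these relationships (the function names are mine, purely illustrative), checking both the 6 V answer above and the 3,000 ohm / 7.5 W figures:

```python
def voltage(power_w, current_a):
    """V = P / I, e.g. the 12 W device above drawing 2 A sits at 6 V."""
    return power_w / current_a

def resistance(volts, current_a):
    """Ohm's law rearranged: R = V / I."""
    return volts / current_a

def power(volts, current_a):
    """P = V * I."""
    return volts * current_a

print(voltage(12, 2))         # 6.0 V
print(resistance(150, 0.05))  # 3000.0 ohms
print(power(150, 0.05))       # 7.5 W
```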
Exactly: you answered your own question. Each DC bulb will drop voltage according to its resistance and the amount of current it draws; in a series string the drops are proportional to each bulb's resistance and add up to the supply voltage.
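Here is a sketch of that proportionality for bulbs in series (the supply voltage and resistances are made-up example values):

```python
def series_drops(supply_v, resistances):
    """The same current flows through every bulb in a series string,
    so each drop is I * R, i.e. proportional to that bulb's resistance."""
    current = supply_v / sum(resistances)
    return [current * r for r in resistances]

# Three bulbs of 10, 20 and 30 ohms across a 12 V supply:
print(series_drops(12, [10, 20, 30]))  # [2.0, 4.0, 6.0] -- the drops sum to 12 V
```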
Thanks to Ohm, who discovered the law. According to Ohm's law, voltage V = R (resistance) × I (amps). So the answer is 100 volts.
A device that expects a 4 V supply will be destroyed if connected to 12 V. Do not confuse voltage with amp-hour capacity, which is what I think you are looking for. Make sure the voltage matches the device, then determine how much current the device draws. The power supply will be rated for how much current it can deliver, and for how long.
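That checklist as a minimal sketch (the 0.5 A draw and 2 A rating below are assumed example figures):

```python
def supply_ok(device_v, device_a, supply_v, supply_a):
    """A supply is usable when its voltage matches the device and it can
    source at least the current the device draws."""
    return supply_v == device_v and supply_a >= device_a

# The 4 V device from above must not see 12 V, whatever the current rating:
print(supply_ok(device_v=4, device_a=0.5, supply_v=12, supply_a=2))  # False
print(supply_ok(device_v=4, device_a=0.5, supply_v=4,  supply_a=2))  # True
```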
Wire sizing of a feed conductor is based on the amperage that the device draws. To calculate amperage from kVA, the voltage of the supply has to be stated. Without that voltage, and without knowing whether the transformer is single-phase or three-phase, an answer cannot be given.
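Once those two facts are known, the arithmetic itself is easy; a sketch using the standard conversion formulas (the 50 kVA, 240 V and 480 V figures are made-up examples):

```python
import math

def amps_from_kva(kva, volts, three_phase=False):
    """Full-load amps from apparent power: I = kVA*1000 / V for single-phase,
    I = kVA*1000 / (sqrt(3) * V) for three-phase (line-to-line voltage)."""
    va = kva * 1000
    return va / (math.sqrt(3) * volts) if three_phase else va / volts

print(round(amps_from_kva(50, 240), 1))                    # 208.3 A single-phase
print(round(amps_from_kva(50, 480, three_phase=True), 1))  # 60.1 A three-phase
```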
Current equals power divided by voltage, so with 110 V across the load, a 900 W system draws about 8.18 A (if the voltage is doubled to 220 V, the current is halved to about 4.09 A).
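The same arithmetic in a couple of lines of Python:

```python
power_w = 900
for volts in (110, 220):
    print(volts, "V ->", round(power_w / volts, 2), "A")  # 8.18 A, then 4.09 A
```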
CMOS is better than a single MOSFET because the complementary pair in a CMOS stage always has one transistor on and the other off. That reduces the idle current to leakage only, and the output voltage sits exactly at the power or ground rail, since there is no voltage drop across the MOSFET that is on. With just a single MOSFET, the device draws current whenever it is in the on state, even when idle.
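A toy model of the two output stages (idealized on/off switches; the 5 V rail and 1 kΩ pull-up are assumed example values, not from any datasheet):

```python
VDD = 5.0  # supply rail, volts

def cmos_inverter(in_high):
    """Complementary pair: exactly one transistor conducts, so the output
    is pulled hard to a rail and no static current flows through the stage."""
    out = 0.0 if in_high else VDD   # NMOS pulls to ground, PMOS pulls to VDD
    return out, 0.0                 # static current ~0 (leakage only in reality)

def nmos_with_pullup(in_high, r_pullup=1000.0):
    """Single NMOS with a resistive pull-up: while the transistor is on,
    current flows from VDD through the resistor even if the output is idle."""
    if in_high:                     # NMOS on: output low, resistor conducts
        return 0.0, VDD / r_pullup  # 5 mA of wasted static current
    return VDD, 0.0                 # NMOS off: output high, no current

print(cmos_inverter(True))     # (0.0, 0.0)
print(nmos_with_pullup(True))  # (0.0, 0.005)
```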
The supply device must have an output voltage that matches that of the load, and a current rating at least equal to what the load draws. So you cannot power a load that draws 2 A from a supply device that is rated at only 1 A.
Assuming the supply is DC (only resistance is given): voltage drop = 10 × 10 × 0.12 = 12 V (approx.).
There are zero watts in 7.5 amps. Watts are the product of amps times volts: W = A × V. As you can see from the equation, a voltage value is missing from your question. Once a voltage value is added, you can find the wattage of the device that draws 7.5 amps.
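For instance, plugging in two assumed voltages (the 120 V and 240 V below are just illustrative, not from the question):

```python
amps = 7.5
for volts in (120, 240):
    print(volts, "V:", amps * volts, "W")  # 900.0 W at 120 V, 1800.0 W at 240 V
```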
Yes, what matters is the voltage: it has to be the same. The device will only draw the 1 A it needs, so an adaptor with a higher current rating, i.e. 2 A, is fine. As long as the adaptor's rated amps equal or exceed those of the device, it will be OK!
225W