ELECTRICITY
Note: Some power supplies operate on 120 volts, 60 Hz AC, but others operate on 230 volts, 50 Hz AC.
Which voltage your power supply uses depends on which country you are in.
If a device receives 120 volts, then 120 volts is its supply voltage.
Voltage times current equals power (for constant direct current).
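The power law above can be sketched in a few lines of Python; the 120 V and 1 A figures are illustrative numbers, not from any specific device.

```python
# Minimal sketch of the DC power law: P = V * I (constant direct current).
def power_watts(volts: float, amps: float) -> float:
    """Power delivered by a constant direct current."""
    return volts * amps

print(power_watts(120, 1))  # 120 watts
```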
A personal computer's power supply receives 120 volts of AC and converts it to 3.3, 5, and 12 volts of DC power.

Another answer: The secondary voltage could be any number of voltages depending on what the power supply was designed for. Your best bet is to get a DC voltmeter and measure the output voltage of each wire on the output side.
It depends on the type of power supply. A power supply used in a desktop computer is generally supplied with AC (Alternating Current) and produces several levels of regulated DC (Direct Current) voltages required by the electronic circuits.
150 / 120 = 1.25
1.25 x 100 = 125
Answer: 125%
150 volts is 125 percent of 120 volts.
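The same percentage calculation, as a small sketch:

```python
# Express one voltage as a percentage of another: (value / base) * 100.
def percent_of(value: float, base: float) -> float:
    """Return `value` as a percentage of `base`."""
    return value / base * 100

print(percent_of(150, 120))  # 125.0 -> 150 V is 125% of 120 V
```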
Yes. If the power supply has a low wattage rating and too many pieces of hardware connected to it, it can. Try buying a power supply rated over 300 watts. One thing to check before replacing the power supply is the voltage selector on the back: if your mains is 120 volts, setting the power supply to 240 volts means the PC gets only half the input voltage it expects. External peripherals with their own power supply will not load the power supply or draw current from the computer. Printers and monitors usually get their power from the wall socket, not the computer, and the same goes for external modems that plug into the wall. Keyboards and mice take negligible power.
Electricity, a power source, a plug
A power supply receives 120 volts of AC power from a wall outlet and converts it to 3.3, 5, and 12 volts of DC power.
Power (watts) = current (amperes) * voltage (volts)
Current (amperes) = voltage (volts) / resistance (ohms)
120 watts = current * 120 volts, so current = 1 ampere
1 ampere = 120 volts / resistance, so resistance = 120 ohms
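The worked example above (a 120 W load on a 120 V supply) can be checked with a short sketch:

```python
# A 120 W load on a 120 V supply.
voltage = 120.0  # volts
power = 120.0    # watts

current = power / voltage       # I = P / V  -> 1.0 ampere
resistance = voltage / current  # R = V / I  -> 120.0 ohms

print(current, resistance)  # 1.0 120.0
```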
Volts * Amps = Watts
120 Volts * 12.5 Amps = 1500 Watts
Doesn't sound like a good idea.
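As a quick check of the multiplication (1500 W is close to the roughly 1800 W limit of a typical 15 A, 120 V household circuit, which is likely why it "doesn't sound like a good idea"):

```python
# Watts = volts * amps.
volts = 120.0
amps = 12.5
watts = volts * amps
print(watts)  # 1500.0
```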
Not unless it is rated for that voltage. You can likely find a step-down transformer from 277 volts to 120 volts.
Household electricity is supplied to a PC's power supply module as alternating current (AC). The power supply converts it into direct current (DC) at various voltages, such as +12 V, +5 V, etc., which are then delivered to the PC's subsystems. Some power supplies operate on 120 volts, 60 Hz AC, but others operate on 230 volts, 50 Hz AC. Which voltage your power supply uses depends on which country you are in.
120 watts of power flows through a circuit carrying 1 amp at 120 volts.
You need to check the current in amps that each appliance takes, and the current in amps that the supply can deliver. Then divide one by the other and there is the answer. Going by the power instead, a 120 V, 10 A supply can give 1200 watts.
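The check described above can be sketched as follows; the appliance currents are made-up example numbers:

```python
# True if the combined appliance current stays within the supply rating.
def fits_on_supply(appliance_amps, supply_amps):
    return sum(appliance_amps) <= supply_amps

supply_amps = 10.0            # a 120 V, 10 A supply
print(120 * supply_amps)      # 1200.0 watts available

print(fits_on_supply([3.0, 4.0, 2.5], supply_amps))  # True  (9.5 A total)
print(fits_on_supply([6.0, 5.0], supply_amps))       # False (11 A total)
```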
The voltage your computer's power supply receives depends mainly on the standard electrical service delivered into homes, offices and factories by the national electrical grid in your country. Just a few examples: if you live in North America, that voltage is 120 volts, but if you live in, say, Europe or Australia, that voltage is 230 volts.

More details: The voltages a power supply was designed to use depend on how old it is. Early power supplies were designed to use only one voltage: the standard voltage of the electrical service delivered by a country's national electrical grid for use by small appliances in homes, offices and factories. Since the invention of the personal computer, most standard power supplies have been designed to operate on a wide range of voltages (such as from 100 volts to 250 volts).
Using the Electrical Power Law, which is: the current (measured in amps) equals the power (measured in watts) divided by the potential difference (measured in volts). So a light bulb designed to use 60 watts of power when supplied with 120 volts must draw 60 watts divided by 120 volts, which is a current of 0.5 amps. The same answer could be expressed in a few different ways: 500 milliamps, 500 mA, or "half an amp"!
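The bulb calculation above, as a small sketch:

```python
# Current drawn by a load: I = P / V.
def current_amps(power_watts: float, volts: float) -> float:
    return power_watts / volts

print(current_amps(60, 120))  # 0.5 amps, i.e. 500 mA
```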
In the US a general-purpose receptacle outlet would be 120 volts; England 240 volts, France 230 volts, Libya 127 volts, Okinawa 100 volts, Tanzania 230 volts . . . it all depends on where your house is.
V = IR, with V = 120 volts and R = 30 ohms
I = V/R = 120/30 = 4 amps
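The same Ohm's-law calculation, checked in Python:

```python
# Ohm's law: I = V / R.
V = 120.0  # volts
R = 30.0   # ohms
I = V / R
print(I)   # 4.0 amps
```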