A transformer.

By appropriate selection of the numbers of turns, a transformer allows an alternating voltage to be stepped up (by making NS greater than NP) or stepped down (by making NS less than NP).

Transformers are among the most efficient electrical 'machines', with some large units able to transfer 99.75% of their input power to their output. Transformers come in a range of sizes, from a thumbnail-sized coupling transformer hidden inside a stage microphone to huge units weighing hundreds of tons used to interconnect portions of national power grids. All operate on the same basic principles, although the range of designs is wide.
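The turns-ratio relationship described above (Vs / Vp = Ns / Np) can be sketched in a few lines of Python, assuming an ideal, lossless transformer (function and variable names here are illustrative):

```python
def secondary_voltage(v_primary, n_primary, n_secondary):
    """Ideal-transformer relation: Vs = Vp * (Ns / Np)."""
    return v_primary * n_secondary / n_primary

# Step up: NS greater than NP
print(secondary_voltage(120.0, 100, 2000))  # 2400.0 V

# Step down: NS less than NP
print(secondary_voltage(120.0, 2000, 100))  # 6.0 V
```

A real transformer delivers slightly less than this because of copper and core losses, but as the answer notes, large units come very close to the ideal.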

Wiki User

16y ago

Continue Learning about Engineering

Explain the power rating of this bulb say 60W what if the voltage changes?

You can check it out yourself using these formulas: Watts = Amps × Volts, and Voltage = Watts / Amps. The 60 W rating applies at the bulb's rated voltage; if the supply voltage changes, the current changes, and so does the actual power dissipated.


What are causes of change in voltage of alternators when loaded?

Actually, when the load on an alternator fluctuates, it changes the torque demanded of the prime mover and the current drawn, so the terminal voltage of the alternator changes.


What is line interphase transformer?

A line/interphase transformer in an electrical device such as a generator feeds the metering display, which shows digital readings such as amps and voltage.


How do you find the current draw of an electrical circuit if you do not know the voltage and wattage of the device?

Basically, if you know the supply voltage and the power used by an appliance, then you use the formula for power, which is Power = Volts × Amps. Rearranged, Amps (current) = Power / Volts. If the power were 2400 watts and the voltage 240 volts, the current would be 2400 / 240 = 10 amps.
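The rearranged formula in that answer is a one-liner; as a minimal Python sketch (the function name is illustrative):

```python
def current_amps(power_watts, volts):
    """I = P / V, rearranged from Power = Volts x Amps."""
    return power_watts / volts

# The worked example from the answer: 2400 W on a 240 V supply
print(current_amps(2400, 240))  # 10.0 A
```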


Can you use a power supply with more amps and not fry the unit?

You will need to be a lot more specific about what you are trying to do here: what is the difference in amps, and what is the device? Generally speaking, it is good practice to use only the power supply that the device is rated for. The biggest issue is this: Power = Voltage × Current (simple version). If the power supply you had was 12 V at 1 amp, then you can supply 12 watts of power; if it was 12 V at 10 amps, then you can supply 120 watts of power. Just because you can supply 10 amps when all you need is one simply means your power supply is bigger than it needs to be. The device will draw only what it is intended to draw. Just make sure the voltage matches.

Related Questions

What is power in electricity?

Watts, which for a resistive load equals the volt-amps (VA): voltage × amps.


Convert Volt-amps to total amps?

To find your amps, divide the volt-amps listed by the voltage you are using (and the device is rated for).

Power in watts = volts × amps. Volts = amps × resistance. All of these formulas can be transposed to find the missing element.

If something is listed as 360 volt-amps and the voltage used is 120 volts, it draws 3 amps. If the same device were used on a 240-volt circuit, it would draw 1.5 amps. The power company charges for power (watts), so the volt-amps are listed on the device and cost you the same regardless of the voltage used. If the same thing were designed for 12 volts, it would draw 30 amps.

The current in amps is equal to the apparent power in volt-amps divided by the voltage in volts: A = VA / V
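The A = VA / V arithmetic in that answer, run over the voltages it mentions, as a short sketch:

```python
def amps_from_va(volt_amps, volts):
    """A = VA / V: apparent power divided by the operating voltage."""
    return volt_amps / volts

# A 360 VA device at the voltages from the answer
for v in (120, 240, 12):
    print(v, "V ->", amps_from_va(360, v), "A")
# 120 V -> 3.0 A, 240 V -> 1.5 A, 12 V -> 30.0 A
```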


How many amps can a 250 va transformer carry?

A transformer is a power source: it provides voltage to a device. Find the voltage rating on the device, say 24 V. Then 250 VA / 24 V ≈ 10 A.


How many amps does a device need?

The amps required for a device depend on its power consumption. You can calculate the amps by dividing the power rating (in watts) by the voltage (in volts) of the device. For example, a 1200 watt device plugged into a 120-volt outlet would require 10 amps (1200 watts / 120 volts = 10 amps).


How do you convert HV amps to LV amps?

To convert high-voltage (HV) amps to low-voltage (LV) amps, you can use the formula HV amps = LV amps × (LV voltage / HV voltage), which follows from the power being the same on both sides. Rearranging, LV amps = HV amps × (HV voltage / LV voltage).
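Assuming an ideal (lossless) transformer so that power is conserved on both sides, the conversion can be sketched as:

```python
def lv_amps(hv_amps, hv_volts, lv_volts):
    """Conserved power: HV_V * HV_A = LV_V * LV_A, solved for LV_A."""
    return hv_amps * hv_volts / lv_volts

# Illustrative figures: 2 A on an 11 kV winding corresponds to
# 55 A on a 400 V winding
print(lv_amps(2.0, 11000, 400))  # 55.0
```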


How do car amps work?

Hi, amps are electrical and the same in a car as elsewhere. Amps are a quantity of electricity flowing, voltage is the pressure pushing it, and resistance is, as it sounds, how difficult it is to push. Peace, crigby


How do you calculate kva from amps?

At what voltage? If you know the voltage, then to get the kilovolt-amps, you multiply the amps by the voltage and divide by 1,000: kVA = (amps × volts) / 1000.


How many KVA is 62 amps?

At what voltage? When you know the voltage, you multiply 62 amps by that voltage and divide by 1,000 to get the kilovolt-amps: kVA = (62 × volts) / 1000.
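For a single-phase supply, that works out as below (the 240 V figure is just an example voltage, not from the question):

```python
def kva(amps, volts):
    """Single-phase apparent power: kVA = (A * V) / 1000."""
    return amps * volts / 1000

# 62 A at an assumed 240 V supply
print(kva(62, 240))  # 14.88 kVA
```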


How many watts is 7.5 amps?

There are zero watts in 7.5 amps. Watts are the product of amps times volts. W = A x V. As you can see from the equation a voltage value is missing from your question. Once a voltage value is added to the equation you can find the wattage of the device that draws 7.5 amps.


The device used to test electronic flow is what?

The flow of electricity (known as current) is measured in amps by an ammeter.