10 mA times 50 ohms is 0.5 volts. 0.5 volts is one two-hundredth of 100 volts, so the total resistance must be 200 times 50 ohms, or 10,000 ohms. Subtracting the movement's own 50 ohms leaves a series multiplier resistor of 9,950 ohms.
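The arithmetic above can be sketched in Python; the function name and structure are illustrative, not from any library. It divides the full-scale voltage by the full-scale current to get the total resistance, then subtracts the movement's own coil resistance:

```python
def multiplier_resistance(v_full_scale, i_full_scale, r_meter):
    """Series multiplier needed to turn a meter movement into a voltmeter."""
    r_total = v_full_scale / i_full_scale  # total resistance the range requires
    return r_total - r_meter               # subtract the movement's own coil

# Values from the answer: 10 mA movement, 50-ohm coil, 100 V full scale.
r_mult = multiplier_resistance(100.0, 0.010, 50.0)
print(r_mult)  # 9950.0 ohms (10,000 ohms total minus the 50-ohm coil)
```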

Wiki User

12y ago

Continue Learning about Electrical Engineering

What resistance value is added to convert a galvanometer into a voltmeter?

It depends on the resistance of the galvanometer and the current required to reach full scale. A 100 ohm meter requiring 1 milliampere would require 99.9 KOhms in series to become a 100 volt voltmeter.
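The 99.9 kOhm figure follows from dividing the full-scale voltage by the full-scale current and subtracting the meter's own resistance; a quick check in Python:

```python
# A 100-ohm movement reaching full scale at 1 mA, scaled to a 100 V range:
r_series = 100 / 0.001 - 100  # total ohms for full scale, minus the coil
print(r_series)  # 99900.0 -> 99.9 kilohms, as stated
```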


How do you find the voltage drop over a resistor?

To find the voltage drop across a resistor, measure it with a voltmeter connected across the resistor. You also need to make sure the impedance of the voltmeter is high enough that it does not change the effective circuit resistance by more than the required accuracy of the measurement.
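To illustrate why the voltmeter's impedance matters, here is a hypothetical sketch (all component values invented) of the loading error a 10-megohm meter introduces when reading across a 1-megohm resistor in a divider:

```python
def parallel(r1, r2):
    """Equivalent resistance of two resistors in parallel."""
    return r1 * r2 / (r1 + r2)

# Invented circuit: 10 V source driving two 1-megohm resistors in series,
# measured by a voltmeter with 10 megohms of input resistance.
v_src, r_top, r_bot, r_meter = 10.0, 1e6, 1e6, 10e6
true_v = v_src * r_bot / (r_top + r_bot)       # 5.0 V with no meter attached
loaded = parallel(r_bot, r_meter)              # meter sits across the bottom leg
meas_v = v_src * loaded / (r_top + loaded)     # what the meter actually reads
print(true_v, round(meas_v, 3))                # 5.0 vs roughly 4.762
```

Even a 10-megohm meter pulls the reading down by about 5 percent here; a lower-impedance meter would disturb it far more.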


Why does an ammeter have a low resistance while a voltmeter has high resistance?

I am going to assume that you mean low resistance in an open-circuit test, performed with a multimeter. An ammeter works by placing a very small resistance in series with the circuit and then measuring the voltage drop across that resistance; the voltage is directly proportional to the current, as given by Ohm's law: E = I x R. Because it sits in series, the ammeter's resistance must be very low so it does not disturb the current it is measuring. A voltmeter, by contrast, is connected in parallel, so its resistance must be very high to avoid drawing current away from the circuit. If you measure the resistance through the ammeter, it will read very low.


What type of meter would be connected on both sides of a resistor in a circuit?

An ammeter is connected in series, on either side of the resistor as required (the resistor is bilateral, so either side works), and a voltmeter is connected in parallel to measure the voltage drop across it.


Can you measure power with a volt meter?

No. A voltmeter measures potential difference (voltage). To measure power, a wattmeter is required. On the other hand, for a d.c. circuit only, you could use a voltmeter and an ammeter, and multiply their readings in order to calculate the power of a load.
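For a d.c. circuit, the voltmeter-plus-ammeter method amounts to multiplying the two readings; a trivial sketch with invented readings:

```python
# Invented meter readings for illustration only.
v_reading = 12.0   # volts, from the voltmeter (across the load)
i_reading = 0.5    # amperes, from the ammeter (in series with the load)
power_w = v_reading * i_reading
print(power_w)  # 6.0 watts
```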

Related Questions

How do you determine internal resistance of voltmeter?

By Ohm's law, resistance is voltage divided by current, so you can determine the resistance of a voltmeter by measuring the total current required to drive it to full scale on each range. In typical digital voltmeters, the resistance is fixed at 11 or 20 megohms by a resistor divider. This is not usually affected by range, because the op-amp that picks up the divided signal contributes negligible loading to the divider. In typical analog voltmeters, the resistance is a function of the range resistor placed in series with the meter movement. For example, a 50-microampere movement gives 20,000 ohms per volt, so you simply multiply the selected full-scale range by 20,000 to get the resistance.
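The ohms-per-volt rule at the end can be expressed directly; the helper name is illustrative. Sensitivity in ohms per volt is the reciprocal of the movement's full-scale current:

```python
def input_resistance(full_scale_volts, i_movement=50e-6):
    """Input resistance of an analog voltmeter on a given range."""
    ohms_per_volt = 1.0 / i_movement   # 50 uA movement -> 20,000 ohms/volt
    return ohms_per_volt * full_scale_volts

print(input_resistance(10))   # 200000.0 ohms on the 10 V range
print(input_resistance(250))  # 5000000.0 ohms on the 250 V range
```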


What is the word for an instrument used to measure the power required to operate a device?

wattmeter (a voltmeter measures only voltage, not power)



If the money multiplier is 4, what is the required reserve ratio (RRR)?

25 percent


What is the least common multiplier of 62?

This cannot be answered properly because at least two numbers are required.


How is a voltmeter connected in the circuit to measure the potential difference between two points?

The voltmeter is connected in parallel between the two points whose potential difference is required.


What value of resistance is used to convert a galvanometer into a voltmeter?

It depends on the resistance of the galvanometer and its full-scale current. A 100 ohm meter reading 1 milliampere requires only 0.1 volts to reach full scale, so a large series resistance is added: 99,900 ohms in series would make it a 100 volt voltmeter. (By contrast, about 0.1 ohms in parallel would make it a 1 ampere ammeter.)
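The "about 0.1 ohms in parallel" figure can be checked: the movement needs 0.1 V for full scale, and the shunt must carry the remaining current. The function name here is illustrative:

```python
def shunt_resistance(i_range, i_movement, r_movement):
    """Parallel shunt needed to turn a meter movement into an ammeter."""
    v_full_scale = i_movement * r_movement        # 0.1 V for this movement
    return v_full_scale / (i_range - i_movement)  # shunt carries the rest

# A 100-ohm, 1 mA movement scaled to read 1 A full scale.
r_shunt = shunt_resistance(1.0, 0.001, 100.0)
print(round(r_shunt, 4))  # 0.1001 ohms -- "about 0.1 ohms" as stated
```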


What do you use to measure resistence?

There are two ways to measure resistance in an electrical circuit. Firstly, one can use a voltmeter and an ammeter simultaneously. Take note that the ammeter is placed in series with the object of interest, while the voltmeter is placed in parallel with it. Since R = V/I, simply divide the voltage reading by the current reading to obtain the resistance of the object of interest in ohms. Alternatively, an ohmmeter (an instrument that measures resistance directly) can be used; basic ohmmeter functions are built into ordinary multimeters, while high-precision instruments are found in laboratories. Hope that helps!
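The first method is just Ohm's law applied to the two readings; a minimal sketch with invented meter readings:

```python
# Invented readings for illustration: voltmeter in parallel with the object,
# ammeter in series with it.
v_reading = 4.5    # volts across the component
i_reading = 0.03   # amperes through it
r_ohms = v_reading / i_reading  # R = V / I
print(r_ohms)  # 150.0 ohms
```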


What describes how lowering the required reserve ratio increases the money supply?

Makes the deposit multiplier bigger. - Dustin SELU


