Based on Ohm's Law, V = I*R, you would need to know the resistance along with one other parameter, or the equation cannot be solved. However, if you can develop a set of independent linear equations for the rest of the circuit surrounding that resistor using Kirchhoff's Voltage Law or Kirchhoff's Current Law, you may still be able to solve for the unknown.
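As a minimal sketch of that idea (the 12 V supply, 100 ohm companion resistor, and 20 mA loop current are assumed example values, not from the question), a single KVL equation around a series loop is enough to pin down an otherwise unknown resistance:

    # Hypothetical series loop: supply -> R_known -> R_unknown -> back to supply
    V_supply = 12.0      # volts (assumed)
    R_known  = 100.0     # ohms (assumed)
    I_loop   = 0.02      # amps; the same everywhere in a series loop (KCL)

    # KVL: V_supply = I*R_known + I*R_unknown  ->  solve for R_unknown
    R_unknown = (V_supply - I_loop * R_known) / I_loop
    print(R_unknown)     # 500.0 ohms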
Ohm's law states that voltage is resistance times current. In a resistor circuit, knowing two of voltage, current, or resistance lets you calculate the third. Actually, this applies to any circuit, whether it contains resistors, capacitors, or inductors. Ohm's law still applies; it just gets more complex when the phase angle of the current is not the same as the phase angle of the voltage.
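As a rough illustration of the AC case (all component values here are made-up examples), Python's complex-number arithmetic handles that phase angle automatically once the resistance is replaced by an impedance in the same V = I*Z relation:

    import cmath

    # Assumed example: 100 ohm resistor in series with a 10 uF capacitor at 50 Hz
    f = 50.0                       # hertz
    R = 100.0                      # ohms
    C = 10e-6                      # farads
    Z = R + 1 / (1j * 2 * cmath.pi * f * C)   # series R-C impedance

    V = 230.0                      # volts RMS, taken as the phase reference
    I = V / Z                      # Ohm's law with a complex impedance
    print(abs(I), cmath.phase(I))  # current magnitude and its phase angle in radians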
It isn't clear what a '2200 watt resistor' is; a resistor's value is specified in ohms. Ohm's Law is expressed as: voltage drop = current x resistance, and the wattage dissipated in the resistor = voltage drop x current. You have to decide whether your resistor is 2200 ohms or is taking 2200 watts, because the two alternatives give different results for the current. If it is 2200 watts at 110 volts, the current is 20 amps. If it is 2200 ohms at 110 volts, the current is 50 milliamps (0.05 amps).
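A small Python sketch of the two readings described above (the 110 V figure is taken from the question as quoted; the rest is just Ohm's law and the power formula):

    V = 110.0                 # volts

    # Reading 1: "2200" is a power in watts -> I = P / V
    P = 2200.0
    print(P / V)              # 20.0 amps

    # Reading 2: "2200" is a resistance in ohms -> I = V / R
    R = 2200.0
    print(V / R)              # 0.05 amps, i.e. 50 milliamps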
To calculate the wattage of a device, you need both the current (in amperes) and the voltage (in volts), since power = voltage x current. In this case, with only the current (4 A) given, the wattage cannot be determined without also knowing the voltage.
Current flow can be calculated using Ohm's Law, which states that current (I) is equal to the voltage (V) divided by the resistance (R), represented by the formula I = V/R. By measuring the voltage across a circuit and knowing the resistance, you can calculate the current flowing through it.
An ammeter, either shunt or inductive. A shunt is an inline resistor of a small, known resistance. Knowing that resistance and the voltage drop across it, one can calculate the amperage by Ohm's law, I = V/R. An inductive, or clamp-on, ammeter measures the magnetic field around the conductor and, using more complex calculations, displays the current, typically on a digital display.
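A minimal sketch of the shunt arithmetic (the 1 milliohm shunt and the 75 mV reading are assumed example values):

    # Shunt ammeter: measure the small voltage drop across a known, small resistance
    R_shunt = 0.001           # ohms (assumed 1 milliohm shunt)
    V_drop  = 0.075           # volts measured across the shunt (assumed)

    I = V_drop / R_shunt      # Ohm's law: I = V / R
    print(I)                  # 75.0 amps flowing through the shunt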
To calculate the voltage, you need to know the current (amperage) as well. The formula to calculate power (watts) given voltage (volts) and current (amps) is: Power (P) = Voltage (V) x Current (I). Without knowing the current, it is not possible to directly convert watts to volts.
Ohm's Law - V = IR.
In order to determine what size of resistor is required to operate an LED from a 9V battery, start by finding the current and voltage required for the LED; that information is available in the LED's specifications. For discussion purposes, let's assume a typical LED at 2.5V and 50mW. That translates to a forward current of 20mA. Build a simple series circuit containing a 9V battery, a resistor of an as-yet-unknown value, and the LED. By Kirchhoff's current law, the current in the LED is the same as the current in the resistor, which is also the same as the current in the battery: 20mA. By Kirchhoff's voltage law, the voltage across the LED plus the voltage across the resistor equals the voltage across the battery, so the resistor drops 6.5V (9 - 2.5). By Ohm's law, resistance is voltage divided by current, so the resistor is 6.5 / 0.02, or 325 Ohms. The nearest standard value is 330 Ohms. Cross-check the power dissipated in the resistor: power is voltage times current, or 6.5V times 0.02A, or 0.13W. A half-watt resistor is more than adequate for this job.
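The same calculation as a short Python sketch, using the assumed 2.5 V / 50 mW LED figures from the example above:

    V_battery = 9.0                    # volts
    V_led     = 2.5                    # volts (from the LED's specifications, assumed)
    P_led     = 0.050                  # watts (50 mW, assumed)

    I = P_led / V_led                  # forward current: 0.02 A (20 mA)
    V_resistor = V_battery - V_led     # KVL: 6.5 V across the resistor
    R = V_resistor / I                 # Ohm's law: 325 ohms -> use 330 ohms standard
    P_resistor = V_resistor * I        # 0.13 W -> a half-watt resistor is fine
    print(I, V_resistor, R, P_resistor)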
There can be no answer to this without knowing the resistance of the resistor involved (in ohms).
There's no way to tell with the given information. "5 W" tells the maximum power the resistor can dissipate without overheating and possibly becoming damaged. That rating doesn't tell you anything about the power it's actually dissipating, or its resistance. Knowing either of those numbers in addition to the current through it would allow you to calculate the voltage across the component: the voltage is the current times its resistance (V = I x R), or the power it's actually dissipating divided by the current (V = P / I). With the current given here, that works out to 5 times its resistance, or 0.2 times the power it's dissipating.
The power rating of a resistor is determined by its physical size. The greater its surface area, the better it can dissipate energy, so the higher its power rating. Knowing its power rating and its resistance will determine the maximum voltage that can be applied to it in order to ensure the resulting current doesn't cause the resistor to overheat. This can be determined by manipulating the equation P = U²/R.
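For instance (the quarter-watt rating and 10 kilohm value below are assumed examples), rearranging P = U²/R gives the maximum safe voltage:

    import math

    P_rating = 0.25            # watts (assumed quarter-watt resistor)
    R        = 10000.0         # ohms (assumed 10 kilohm value)

    # P = U^2 / R  ->  U_max = sqrt(P * R)
    U_max = math.sqrt(P_rating * R)
    print(U_max)               # 50.0 volts before the rating is exceeded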