A milliamp is one one-thousandth of an ampere (1 mA = 0.001 A). So the difference is that a milliamp is a much smaller unit of current than an ampere.
2.857 amps
1,000 milliamps equals 1 amp.
1 amp = 1,000 milliamps. So for 0.01 amps: 0.01 × 1,000 = 10 milliamps.
10 milliamps is equivalent to 0.01 amps.
50 milliamps is equal to 0.05 amps.
You just did. You could also express it in amps: 0.6 amps = 600 milliamps.
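For anyone who wants to script these conversions, here is a minimal Python sketch; the function names are made up for illustration, not from any particular library:

def ma_to_amps(milliamps):
    # 1,000 milliamps = 1 amp
    return milliamps / 1000.0

def amps_to_ma(amps):
    # the inverse conversion
    return amps * 1000.0

print(ma_to_amps(600))   # 0.6 (amps)
print(amps_to_ma(0.01))  # 10.0 (milliamps)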
7 amps
34.539 milliamps is only 0.034539 amps. A 16-gauge wire will handle that.
There are 1,000 milliamps in 1 amp. Since the NEC limits you to loading a lighting circuit to no more than 80% of its rating, a 20-amp circuit can carry 16 amps, or 16,000 milliamps. That means you can have 2,000 lamps of 8 milliamps each.
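As a sanity check on that lamp count, here is a short Python sketch; the 20-amp breaker rating is an assumption implied by the 16-amp figure above, and the variable names are illustrative:

circuit_rating_amps = 20      # assumed 20-amp lighting circuit
nec_load_factor = 0.80        # NEC 80% loading limit
lamp_current_ma = 8           # draw per lamp, in milliamps

usable_ma = circuit_rating_amps * nec_load_factor * 1000  # 16,000 mA
print(usable_ma / lamp_current_ma)  # 2000.0 lamps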
The main difference between 5 amps and 10 amps is the amount of current flowing through a circuit. 10 amps is double the current of 5 amps, which means that, at the same voltage, a 10-amp circuit can deliver twice as much power without overloading.
You also need to know the voltage and wattage: amps = watts / volts. Try the iPhone app "Watts2Amps".
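If you'd rather skip the app, the same formula takes a couple of lines of Python; this sketch just applies amps = watts / volts:

def amps_from_watts(watts, volts):
    # power relation: I = P / V
    return watts / volts

# e.g., a 1,200-watt appliance on a 120-volt circuit draws 10 amps
print(amps_from_watts(1200, 120))  # 10.0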
The difference between 220 amps and 240 amps is simply the amount of current: 220 amps denotes a current capacity of 220 amperes, while 240 amps denotes 240 amperes, 20 amperes more. The higher the amperage rating, the more current (and, at a given voltage, the more power) a circuit can supply to electrical devices.