It depends entirely on the type of monitor and power supply, but as a rough rule of thumb you can figure about 100 watts per amp at the wall (a generalization, of course, since it depends on your line voltage). If you have, for instance, a 350-watt power supply, that means it can put out 350 watts for the computer to use, but that rating refers to the low-voltage DC side of the supply, not its draw from the AC line.
As an example, I have a CRT monitor that is rated at 2.5 amps, but my power supply has no rating tag, so I'd approximate its draw at 1 to 2 amps.
People tend to confuse the power supply's output wattage with its draw from the AC line, but there are other variables that come into play.
Yes, those variables are power-supply efficiency and power factor correction (PFC); beyond that, the relationship mostly follows the power law P = V × I.
That varies depending on the computer (see the label on your computer for its ratings):
- a very tiny desktop computer that I have draws about 1 A
- an old quad-core desktop computer that I have draws about 12 A
- etc.
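The rule-of-thumb conversion in the answers above can be sketched as a quick calculation. This is a minimal sketch, assuming a 120 V mains supply and an illustrative ~85% supply efficiency; neither figure comes from the answers themselves:

```python
# Rough amps-at-the-wall estimate for a computer power supply.
# Assumes 120 V mains and an assumed ~85% efficiency; both
# figures are illustrative, not taken from the answers above.
def wall_amps(dc_watts, mains_volts=120.0, efficiency=0.85):
    """Estimate AC current draw for a given DC output load."""
    ac_watts = dc_watts / efficiency  # the supply draws more than it delivers
    return ac_watts / mains_volts

# A 350-watt supply running at full load:
print(round(wall_amps(350), 2))  # roughly 3.4 A at the wall
```

A lower efficiency or lower line voltage pushes the wall draw higher, which is why the "100 watts per amp" rule is only a rough guide.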
100 amps
Amps as in amplifiers? It depends on how many speakers you have. Or amps as in current draw? Again, it depends on your power needs, your power amps, etc.
1,100 watts, or about ten amps, plus another 3 to 4 amps for the turntable light and fan.
It is drawing 0.06 amps.
The amps an oven draws are governed by the oven's total wattage and the voltage supplied to it.
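That relationship is just the power law rearranged, I = P / V. A minimal sketch, where the 4,800 W / 240 V oven figures are illustrative assumptions rather than values from the answer:

```python
# Current draw from wattage and supply voltage (I = P / V).
# The 4800 W oven on a 240 V circuit is an illustrative example.
def amps(watts, volts):
    return watts / volts

print(amps(4800, 240))  # 20.0 amps
```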
A dishwasher typically draws around 10-12 amps when in operation.
1 amp
A deep freezer can draw between 6 and 8 times its running amps on start-up, depending on the model and size of the freezer. For example, if a freezer runs at 6 amps, it could draw between 36 and 48 amps when starting up.
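The start-up figure is just the running current times a surge multiplier. A small sketch using the 6-amp example; the 6x-8x range comes from the answer above:

```python
# Start-up (inrush) current estimate for a freezer compressor.
# The 6x-8x multiplier range is the one quoted in the answer.
def startup_amps(running_amps, low_mult=6, high_mult=8):
    return running_amps * low_mult, running_amps * high_mult

print(startup_amps(6))  # (36, 48)
```

This is why freezers and other motor loads need a breaker and wiring sized well above their steady running current.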
It would be at least 250 amps, maybe 300 amps.
Each 32-watt bulb in a 48-inch fluorescent light typically draws around 0.27 amps. Therefore, a two-bulb setup would draw approximately 0.54 amps in total.
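Those per-bulb numbers follow from I = P / V on a 120 V supply (the voltage is an assumption here; the answer doesn't state it, and ballast losses and power factor are ignored):

```python
# Per-bulb and total current for fluorescent tubes (I = P / V).
# Assumes a 120 V supply; ballast losses and power factor ignored.
def bulb_amps(watts_per_bulb, volts=120.0):
    return watts_per_bulb / volts

per_bulb = bulb_amps(32)   # about 0.27 A per bulb
total = 2 * per_bulb       # about 0.53 A (the answer rounds per-bulb first, giving 0.54)
print(round(per_bulb, 2), round(total, 2))
```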