
The cost of using a 70 watt device depends on your location and the rate your utility provider charges for electricity. To calculate the cost, use the formula: (watts / 1000) x hours of use x cost per kilowatt-hour.
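As a quick sketch, here is that formula in Python; the 5 hours of use and $0.15 per kWh rate are assumed example values, so substitute your own figures:

```python
# Estimate the cost of running a 70 W device.
# Hours of use and electricity rate below are assumed examples.
watts = 70            # device power draw in watts
hours = 5             # assumed hours of use
rate_per_kwh = 0.15   # assumed cost per kilowatt-hour in dollars

kilowatts = watts / 1000                 # 0.07 kW
cost = kilowatts * hours * rate_per_kwh  # (watts / 1000) x hours x rate
print(f"Cost for {hours} hours: ${cost:.4f}")  # about $0.05
```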


Continue Learning about Natural Sciences

How much does it cost to run a 5 watts transformer?

A transformer rated at 5 watts uses 5 watt-hours of energy for each hour it runs. Electricity is billed per kilowatt-hour (1,000 watt-hours), so you need the rate your power company charges per kWh. If you pay, say, $3.00 per kWh, the transformer would have to run for 200 hours (5 W x 200 h = 1,000 Wh = 1 kWh) before it cost you $3.00.
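A minimal check of those numbers in Python, keeping the hypothetical $3.00 per kWh rate from the answer:

```python
# How long a 5 W transformer takes to use 1 kWh, and what each hour costs,
# at the hypothetical $3.00/kWh rate used in the answer above.
watts = 5
rate_per_kwh = 3.00
hours_per_kwh = 1000 / watts                 # 200 hours to consume 1 kWh
cost_per_hour = watts / 1000 * rate_per_kwh  # $0.015 per hour
print(hours_per_kwh, cost_per_hour)
```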


How many amps with 70 watts 13.8 Volts dc?

To calculate the amperage, we can use the formula: Amps = Watts / Volts. In this case, it would be 70 watts / 13.8 volts ≈ 5.07 amps. So, with 70 watts and a voltage of 13.8 volts DC, the amperage would be approximately 5.07 amps.
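The same calculation as a short Python sketch:

```python
# Current drawn by a 70 W load on a 13.8 V DC supply: I = P / V
watts = 70
volts = 13.8
amps = watts / volts
print(f"{amps:.2f} A")  # about 5.07 A
```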


How much does 350 watts cost?

To determine the cost of using 350 watts, you would need to know the cost per kilowatt-hour (kWh) from your utility company. Once you have that, convert watts to kilowatts (350 watts = 0.35 kilowatts), multiply by the hours of use, and then multiply by the cost per kWh.
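For example, at an assumed rate of $0.12 per kWh and an assumed 10 hours of use (your actual figures may differ), the calculation looks like this:

```python
# Cost of running a 350 W load, assuming $0.12/kWh and 10 hours of use.
watts = 350
hours = 10            # assumed hours of use
rate_per_kwh = 0.12   # assumed electricity rate
cost = watts / 1000 * hours * rate_per_kwh
print(f"${cost:.2f}")  # $0.42 for 10 hours
```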


How many watts in 820 lumens?

The number of watts needed to produce 820 lumens depends on the type of light source. For traditional incandescent bulbs, it would be around 60-70 watts. However, for LED or CFL bulbs, it would be much lower, around 10-15 watts.


How much will it cost to run a 185 watts per day?

To calculate the cost of running a device that consumes 185 watts, you need to know the cost of electricity per kilowatt-hour. Assuming an average rate of $0.12 per kWh, running a 185-watt device for 24 hours a day would cost about $0.53 per day (185 watts / 1000 x 24 hours x $0.12).
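A quick check of that arithmetic (the $0.12 per kWh rate is the answer's assumption):

```python
# Daily cost of a 185 W device running 24 hours at an assumed $0.12/kWh.
watts = 185
hours = 24
rate_per_kwh = 0.12
daily_cost = watts / 1000 * hours * rate_per_kwh
print(f"${daily_cost:.2f} per day")  # about $0.53
```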