Ten cents is one tenth of a dollar, so you would write it as $0.10; put another way, 10 cents is 100 cents divided by 10.
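The cents-to-dollars conversion above can be sketched in a few lines of Python; the function name `cents_to_dollars` is my own, not from the original answer.

```python
def cents_to_dollars(cents: int) -> str:
    """Format a whole number of cents as a dollar amount.

    Dividing by 100 converts cents to dollars; the .2f format
    keeps exactly two decimal places, as money is written.
    """
    return f"${cents / 100:.2f}"

print(cents_to_dollars(10))  # → $0.10
```

The same function handles any coin value: `cents_to_dollars(5)` gives `$0.05` and `cents_to_dollars(25)` gives `$0.25`.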
0.10
3.5 cents in decimal is 0.035
A nickel is worth 5 cents, which is written as $0.05. Coins are all fractions of a dollar. For example, a dime is worth 10 cents, written $0.10, and a quarter is worth 25 cents, written $0.25.
10^-1 = 0.1
1.77 cents in decimal form is 0.0177
Well, since 10% of 10 dollars is 1 dollar, 1% of 10 dollars is $0.10, or 10 cents.
10 cents is $0.10; 4 cents is $0.04.
.05
Forty cents as a decimal is 0.40.
Put the numbers in a column with the decimal points aligned. Add the numbers ignoring the decimal points. Insert the decimal point in the answer exactly below the column of decimal points in the numbers being added (summands).
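The column-addition rule above can be sketched with Python's `decimal` module, which keeps exact decimal places so the decimal points effectively stay aligned; the function name `add_decimals` is my own.

```python
from decimal import Decimal

def add_decimals(values):
    """Sum a list of decimal strings exactly, as in column addition.

    Decimal arithmetic is exact for these values, unlike binary
    floats, so 0.10 + 0.05 + 0.25 comes out as exactly 0.40.
    """
    total = Decimal("0")
    for v in values:
        total += Decimal(v)
    return total

print(add_decimals(["0.10", "0.05", "0.25"]))  # → 0.40
```

Using plain floats here (`0.10 + 0.05 + 0.25`) can introduce tiny binary rounding errors, which is why `Decimal` is the idiomatic choice for money.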
10 cents is the same as $0.10. To multiply $0.10 by 1,000,000,000, first multiply 10 by 1,000,000,000 to get 10,000,000,000, then move the decimal point two places to the left (because .10 has two digits after the decimal point), which gives $100,000,000. If you want THAT in cents, move the decimal point two places back to the right to get 10,000,000,000 cents.
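One common way to avoid the decimal-point shuffling above is to do all the arithmetic in whole cents and only convert to dollars at the end; this is a sketch, with names of my own choosing.

```python
def dimes_value(count: int) -> tuple[int, int]:
    """Return (total_cents, total_dollars) for `count` dimes.

    Working in integer cents avoids any floating-point rounding;
    dividing by 100 at the end converts cents to whole dollars.
    """
    total_cents = 10 * count       # one dime = 10 cents
    total_dollars = total_cents // 100
    return total_cents, total_dollars

cents, dollars = dimes_value(1_000_000_000)
print(cents)    # → 10000000000 (10 billion cents)
print(dollars)  # → 100000000 (100 million dollars)
```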
3 cents in decimal is 0.03.
Two nickels is 10 cents, or 0.10.