Q: How do you divide one microsecond by one millisecond?

Best Answer

I believe this is right, although I am far from being extremely proficient at math. Start by writing both values in the same unit, seconds: 1 us (one microsecond) is 0.000001 s, and 1 ms (one millisecond) is 0.001 s. To make the division easier, move the decimal point the same number of places in both numbers; shifting six places to the right turns 0.000001 into 1 and 0.001 into 1000. Because both numbers were shifted by the same amount, the ratio does not change, so the division is simply 1/1000 = 0.001. In other words, one microsecond divided by one millisecond is 0.001, or one thousandth; the seconds cancel, so the result is a plain number rather than another unit of time. I am pretty sure this is correct and hope this answers your question. If anyone feels I am wrong, please point out my mistakes.
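If you want to check this on a computer, here is a minimal sketch (Python is just my choice here, not something from the question) that does the same calculation, using the decimal module so the small values are represented exactly:

from decimal import Decimal

# Express both durations in seconds.
microsecond = Decimal("0.000001")   # 1 us = 10^-6 s
millisecond = Decimal("0.001")      # 1 ms = 10^-3 s

# Dividing the two durations gives a dimensionless ratio.
ratio = microsecond / millisecond
print(ratio)   # prints 0.001 -> one microsecond is one thousandth of a millisecond

The same answer, 0.001, comes out whichever common unit you work in, because the units cancel in the ratio.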

Wiki User
15y ago