On which coast does the sun set in the US?

Updated: 8/20/2019
Wiki User

12y ago

Best Answer

The sun always sets in the west, so in the U.S. that means the Pacific coast of California, Oregon, and Washington, or the west-facing Gulf coast of Florida.
