One benefit of being outdoors in the sunshine is that sunlight triggers your skin to produce vitamin D, a nutrient essential for healthy skin and bones. Moderate sun exposure can also have therapeutic effects, such as lifting mood. At the same time, spending time in shaded natural settings, such as under trees, can limit direct exposure and help protect your skin from the harmful effects of too much sun.

Wiki User

12y ago
