Physical education was officially recognized in the United States in the early 1800s, and colleges and universities began offering physical education programs throughout that century. In 1866, California became the first state to mandate physical education in its schools.
