It appears there may be a typo in your question. If you meant naturism, it is a lifestyle that advocates social nudity and encourages a natural, accepting attitude toward the human body. Naturists typically practice nudity in designated settings such as nude beaches or resorts, and place a high value on body acceptance and comfort.