NATURISM


noun: the belief or doctrine that attributes everything to nature as a sanative agent.

Webster English vocab.