WILD WEST



Wild West n (1849): the western U.S. in its frontier period, characterized by roughness and lawlessness -- Wild West adj

Merriam-Webster English vocabulary.