WEST


Meaning of WEST in English

the West (COUNTRIES) [noun] [U] - North America, those countries in Europe which did not have communist governments before the 1990s, and some other parts of the world.
Examples: East-West relations. There has been concern in/throughout the West about the effects of this measure.

Cambridge English Dictionary.