WEST


Meaning of WEST in English

(n.) Formerly, that part of the United States west of the Alleghany Mountains; now, commonly, the whole region west of the Mississippi River; esp., that part which is north of the Indian Territory, New Mexico, etc. Usually with the definite article.

Webster's English dictionary.