WEST
Meaning of WEST in English
(n.) Formerly, that part of the United States west of the Alleghany Mountains; now, commonly, the whole region west of the Mississippi River; esp., that part which is north of the Indian Territory, New Mexico, etc. Usually with the definite article.
Webster's English Dictionary. 2012