WEST (n.) A country, or region of country, which, with regard to some other country or region, is situated in the direction toward the west. — Webster's English Dictionary, 2012