WEST COAST

noun

[ sing. ] the states on the west coast of the US, especially California

Oxford Advanced Learner's English Dictionary.