WEST COAST


Meaning of WEST COAST in English

the name commonly used for the states on the west coast of the US, especially California. To many people the West Coast suggests a place that has sunny weather most of the time, where the people have a relaxed way of life and often invent or follow new fashions, particularly those involving physical fitness or psychological help.

Oxford Guide to British and American Culture.