WEST GERMANY


ˌWest ˈGermany (also the Federal Republic of Germany)

a former country in western Europe, between France and East Germany, whose capital city was Bonn. In 1949 Germany was split into two countries: the western part became West Germany, or the Federal Republic of Germany, and the eastern part became East Germany, or the German Democratic Republic, a communist country. After the fall of the Berlin Wall in 1989, the two countries were reunited in 1990 as a single Germany.

—West German noun, adjective

Longman Dictionary of Contemporary English.