YEAR IN REVIEW 1999: VIAGRA


Viagra: A Second Honeymoon?

The approval in March 1998 of Viagra (sildenafil), the first oral drug for male impotence, brought new hope to the millions who suffered from this condition and revitalized the joke repertoire of late-night-TV talk-show hosts. The number of Viagra jokes was outpaced only by the number of prescriptions written: more than six million during the drug's first seven months on the market. Toward year's end, however, the enthusiasm was tempered by a cautionary note, as the U.S. Food and Drug Administration (FDA) warned of potentially serious side effects in some patients. Viagra offered a novel way to treat male impotence, or erectile dysfunction, as it is known medically, a condition that affected an estimated 30 million men in the U.S. alone. Taken in pill form about an hour before sexual activity, Viagra improves blood flow to the penis and thereby allows a man to respond naturally to sexual stimulation. In clinical trials the drug was shown to restore sexual function in 7 out of 10 men. Previous treatments had involved surgical implants, suppositories, pumplike devices, and injection of drugs directly into the penis. Doctors were immediately swamped with requests for the new drug, which cost $8-$10 a pill. The overwhelming demand created some unanticipated ethical, legal, and economic dilemmas. Although a full medical examination is recommended prior to taking Viagra, a controversial new industry sprang up, offering consumers prescriptions via the Internet. The high cost led some insurers, both private and public, to refuse to cover the drug, whereas others, including some state Medicaid programs, placed some limits on coverage. Medical reports later in the year led the FDA to issue new warnings in November and prompted Pfizer, Inc., Viagra's manufacturer, to add them to its drug labeling for physicians. Although emphasizing that the drug was safe and effective for most users, the agency said that caution should be used in prescribing Viagra for some patients, including those who had recently suffered a heart attack or stroke and those with life-threatening arrhythmias, a history of heart failure or unstable angina, very low or very high blood pressure, or certain eye disorders. This warning was in addition to the original caution against Viagra use by people taking the drug nitroglycerin. The FDA also warned of the rare occurrence among some Viagra users of painful, prolonged erections requiring medical attention. As of mid-November the agency had received reports of 130 deaths among patients taking the drug, many from heart attacks, but there was no proof that the new medication itself caused the deaths.

CRISTINE RUSSELL

Tragic Optimism for a Millennial Dawning

Wallace's Paradox

As 1998 unfolded in the homestretch of our millennial countdown, I remembered that, exactly 100 years ago, the leader in my profession of evolutionary biology, then a new science dedicated to explaining the causes and pathways of life's ancient history, wrote a book to mark the end of the last century. Charles Darwin died in 1882, so leadership had fallen to Alfred Russel Wallace, who also had recognized the principle of natural selection in an independent discovery made before Darwin's publication.
In The Wonderful Century: Its Successes and Failures, published in 1898, Wallace presented a simple thesis combining both joy and despair: The 19th century had witnessed such a spectacular acceleration of technological progress that innovations made during this mere hundred years had surpassed the summation of change in all previous human history. This dizzying pace, however, may do more harm than good because human morality, at the same time, had stagnated or even retrogressed--thereby putting unprecedented power (for good or evil) into the hands of leaders inclined to the latter alternative. Wallace summarized his argument: "A comparative estimate of the number and importance of these [technological] achievements leads to the conclusion that not only is our century superior to any that have gone before it, but that it may be best compared with the whole preceding historical period. It must therefore be held to constitute the beginning of a new era in human progress. But this is only one side of the shield. Along with these marvelous Successes--perhaps in consequence of them--there have been equally striking Failures, some intellectual, but for the most part moral and social. No impartial appreciation of the century can omit a reference to them; and it is not improbable that, to the historian of the future, they will be considered to be its most striking characteristic." As the 20th century (and an entire millennium) draws to its close, we can only reaffirm Wallace's hopes and fears with increased intensity--for our century has witnessed even greater changes, with special acceleration provided in recent years by two great revolutions--in genetic understanding and the electronic technology of information processing. Our century has also, however, experienced the depths of two world wars, with their signatures of senseless death in the trenches of Belgium and France, in the Holocaust, and at Hiroshima. How dizzyingly fast we move, yet how stuck we remain. History will not remember the following items as particularly memorable or defining features of 1998, but two pairs of remarkably similar films, released by two rival companies, epitomize Wallace's paradox as applied to our time. The summer of 1998 featured two disaster movies, one about a comet, the other about an asteroid, on track to strike and destroy the Earth and how courageous heroes divert the menace with nuclear weapons, thus saving our planet: Deep Impact by DreamWorks and Armageddon by Disney. A few months later the same companies fought another round by releasing, nearly simultaneously, moral fables about insects (standing in for human values, of course) done entirely by computer animation: Antz and A Bug's Life, respectively. Consider the dizzying spiral of upward scientific and technological advance illustrated in these pairings. The intellectual basis for these disaster films--the theory that an extraterrestrial impact triggered the catastrophic mass extinctions that wiped out dinosaurs (along with half the species of marine organisms) 65 million years ago and gave mammals their lucky and golden opportunity--was first proposed (and dismissed as fanciful nonsense by most of my paleontological colleagues) in 1980. Late in 1998 a published report that a tiny fragment of the impacting asteroid had been recovered from strata deposited at the time of the hypothesized blast pretty much sealed the continually improving case for this revolutionary scenario. 
Few hypotheses that begin in such controversy can progress to accepted fact in a mere 20 years. Even fewer ideas ever pass from the professional world of science into hot themes for mass markets of our commercial culture. (The popular resonances are not hard to identify in this case: if extraterrestrial impact caused mass extinctions millions of years ago, why not again? And why not use our nuclear weapons, heretofore imbued with no conceivable positive utility in saving life, to fend off such a cosmic threat?) Consider also the equally accelerating spiral of technological advance illustrated by the manufacture of these films--60 years from Disney's first animated full-length feature, Snow White and the Seven Dwarfs, where each frame had to be drawn and painted by hand, to orders of magnitude more complexity based on orders of magnitude less handwork, as computers interpolate smooth action between end points of human design in A Bug's Life. And yet, to invoke the other side of Wallace's paradox, these films, for all their technical wizardry, remain mired in the same conventions, prejudices, and expectations that keep our social relations (and moral perceptions) so far behind our material accomplishments. Both Antz and A Bug's Life feature young male heroes who are reviled and misunderstood by a conformist multitude but who eventually save their colonies by their individualistic ingenuity--and, of course, then win the (anthropomorphic) hand of the young queen. But true ant societies are matriarchies. Males are rare and effectively useless, and all the so-called workers and soldiers (including the prototypes for the two male heroes of the recent movies) are sterile females. In A Bug's Life the worthy ants have four limbs and look human; only the villainous grasshoppers have--as all insects truly do--six legs (and a resulting sinister appearance in their two pairs of arms, at least to human observers; good guys must look like us). The transitions between centuries and millennia fall at precise, but entirely arbitrary, boundaries of human construction. No astronomical or biological cycle works at a repeat frequency of exact tens, hundreds, or thousands. Yet we imbue these purely conventional boundaries with our own decreed meaning, and parse time into decades (the Gay Nineties, the Roaring Twenties, the Complacent Fifties). We have even coined a phrase to mark our anxiety and stocktaking at major boundaries--the fin de siècle (or "end of century") phenomenon. We are now about to face, for the first time in the history of most nations and traceable family lines, the largest of all human calendric boundaries in a millennial transition. And who can possibly predict what the first years of the new millennium will bring? Wallace's paradox--the exponential growth of technology matched by the stagnation of morality--implies only more potential for instability and less capacity for reasonable prognostication. But at least we might find some solace in the sharply decreasing majesty of our fear. At the last millennial transition of year 1000, many European Christians awaited (either with fear or ecstasy) the full apocalyptic force of Christ's Second Coming to initiate his thousand-year reign of Earthly bliss. At the turning in 2000, we focus most dread upon the consequences of a technological glitch that may make our computers read a two-digit year code as 1900 rather than 2000.

Tragic Optimism

Human rationality, that oddest of all unique evolutionary inventions, does confer some advantages upon us.
This most distinctively human trait does grant us the capacity to analyze the sources of current difficulty and to devise (when possible) workable solutions for their benign resolution. Unfortunately, as another expression of Wallace's paradox, other all-too-human traits of selfishness, sloth, lack of imagination, fear of innovation, moral venality, and old-fashioned prejudice often conspire to overwhelm rationality and to preclude a genuine resolution that good sense, combined with good will, could readily implement under more favourable circumstances. The lessons of history offer no guarantees but only illustrate the full range of potential outcomes. Occasionally, we have actually managed to band together and reach genuine solutions. Smallpox, once the greatest medical scourge of human civilization, has been completely eradicated throughout the world, thanks to coordinated efforts of advanced research in industrialized countries combined with laborious and effective public health practices in the developing world. On a smaller but still quite joyous note, during 1998 the bald eagle reached a sufficient level of recovery--thanks to substantial work by natural historians, amateur wildlife enthusiasts, and effective governmental programs--to become the first item ever deleted for positive reasons from the American Endangered Species List. Just as often, unfortunately, we have failed because human frailty or social circumstances precluded the application of workable solutions. (Cities become buried by volcanoes viewed as extinct only because they haven't erupted in fallible memory. Houses built on floodplains get swept away because people do not understand the nature of probability and suppose that, if the last "hundred-year flood" occurred in 1990, the next deluge of such intensity cannot happen until 2090--thus tragically failing to recognize the difference between a long-term average and a singular event. In 1998 did India or Pakistan do anything but increase their expenditures, decrease their world respect, and endanger their countrymen by matching atomic tests, with both nations remaining at exactly the same balance after their joint escalation?) I do, however, think that one pattern--the phenomenon that engenders what I have called "tragic optimism" in setting a title for this essay--does emerge as our most common response, and therefore as the potential outcome that should usually attract our betting money in the lottery of human affairs. We do usually manage to muddle through, thanks to rationality spiced with an adequate dose of basic human decency. This capacity marks the "optimism" of my signature phrase. But we do not make our move toward a solution until a good measure of preventable tragedy has already occurred to spur us into action--the "tragic" component of my designation. To cite an example from the hit movie of 1998--James Cameron's gloriously faithful (and expensive) re-creation of the greatest maritime disaster in our civil history--we do not equip ships with enough lifeboats until the unsinkable Titanic founders and drowns a thousand people who could have been saved. We do not develop the transportation networks to distribute available food, and we do not overcome the social barriers of xenophobia until thousands have died needlessly by starvation. 
(As pointed out by Amartya Sen, winner of the 1998 Nobel Memorial Prize in Economic Science, no modern famine has ever been caused by a genuine absence of food; people die because adequate nourishment, available elsewhere, cannot reach them in time, if at all.) We do not learn the ultimate wisdom behind Benjamin Franklin's dictum that we must either hang together or hang separately, and we do not choose to see, or to vent our outrage in distant lands beyond our immediately personal concerns, until the sheer horror of millions of dead Jews in Europe or Tutsi in Africa finally presses upon our consciousness and belatedly awakens our dormant sense of human brotherhood. To cite a remarkable example from 1998 on the successes of tragic optimism, many people seem unaware of the enormously heartening, worldwide good news about human population growth--a remarkable change forged by effective research, extensive provision of information, debate, and political lobbying throughout the planet, and enormous effort at local levels of village clinics and individual persuasion in almost all nations--mostly aimed at the previously neglected constituency of poor women who may wish to control the sizes of their families but had heretofore lacked access to information or medical assistance. In the developing countries of Africa, Latin America, and Asia--the primary sources of our previous fears about uncontrollable population explosions that would plunge the world into permanent famine and divert all remaining natural environments to human agricultural or urban usages--the mean number of births per woman has already been halved from a previous average of about six to a figure close to three for the millennial transition. In most industrialized nations birthrates have already dropped below replacement levels to fewer than two per woman. But, as the dictates of tragic optimism suggest, we started too late once again. Today's human population stands at about 5.9 billion, arguably too high already for maximal human and planetary health. Moreover, before stabilization finally gains the upper hand, the momentum of current expansion should bring global levels to about 10.4 billion by 2100. Most of this increase will occur in maximally stressed nations of the developing world. We have probably turned the tide and gained the potential for extended (and even prosperous) existence on a stable planet, but we dithered and procrastinated far too long and must bear the burden of considerable, and once preventable, suffering (and danger) as a result.

Whither Europe's Monarchies?
by Vernon Bogdanor

Before World War I every nation in Europe except France, Portugal, and Switzerland was a monarchy. In 1998, by contrast, only eight monarchies remained, if very small states such as Liechtenstein and Monaco were excluded. The eight were Belgium, Denmark, Luxembourg, The Netherlands, Norway, Spain, Sweden, and the United Kingdom. The British monarchy differed from the others in that the queen was an international monarch, the sovereign not only of Britain but also of 15 Commonwealth countries outside Europe. There are two reasons for the demise of monarchy in much of Europe. The first is defeat in war. The collapse of Austria-Hungary, Germany, and Russia during World War I resulted in the end of the Habsburg, Hohenzollern, and Romanov dynasties, respectively. In World War II collaboration with fascism ended the Italian monarchy, and after the war the new communist regimes rapidly removed the sovereigns of Bulgaria and Romania.
The second reason is the process of democratic change. Monarchies have survived in Europe only where they have accommodated themselves to democracy rather than resisting it. In Europe monarchy exists in limited, constitutional form. A democracy whose head of state is a sovereign links two conflicting, some would say contradictory, notions. A democracy is a form of government in which political authority derives from popular election. A sovereign, by contrast, does not rule because of election but instead by inheritance and for life. This conflict has been resolved by means of the concept of a constitutional monarchy, one that operates in accord with constitutional rules. These rules, which may be either legal or nonlegal, limit the power of the sovereign and ensure that he or she acts in accordance with democratic norms. Monarchs in modern Europe have three major kinds of functions. Constitutional functions include appointing a prime minister and dissolving the parliament. In Britain these duties are generally of a formal nature. Because Britain's electoral system generally yields a clear winner, there is rarely any dispute as to who ought to be appointed prime minister after a general election. Since 1918 there have been only three occasions when that has not been the case--the general elections of 1923, 1929, and February 1974. In the other European monarchies, however, elections are held under various systems of proportional representation. These rarely yield clear majorities for a single party. Coalition or minority government thus becomes the norm. This gives the sovereign some leeway because it may be unclear as to who ought to be appointed prime minister after a general election. It may also be unclear as to whether a prime minister is entitled to a dissolution of the parliament, for there might well be an alternative majority within the legislature capable of sustaining an alternative government. In the Scandinavian monarchies the sovereign generally plays a comparatively passive constitutional role. In Belgium, The Netherlands, and Luxembourg, however, the sovereign tends to be more active. Sometimes, indeed, the views of the sovereign in those countries influence the outcome of the government-formation process. This influence is generally used to secure national unity. In both Belgium and The Netherlands, the sovereign has played a unifying role in societies divided by language and religion, respectively. The second set of functions undertaken by the sovereign is ceremonial in kind. Sovereigns carry out a wide range of public engagements and duties. These engagements--once dismissed by French Pres. Charles de Gaulle as opening exhibitions of chrysanthemums--are means by which the sovereign can be seen as fulfilling the third and most important function, that of representation. In the modern world the central role of a constitutional monarch is a symbolic or representational one, in which he or she represents and symbolizes not just the government but the nation. The ceremonial functions are an important means of fulfilling this function, since, to be an effective symbol, a sovereign must be seen. The great advantage of monarchy is that it is a system in which the head of state is free from party ties. It is generally easier for a hereditary monarch to represent the nation than it is for a president, who will often be a politician, chosen through an election that might have been divisive. 
In 1998 the European constitutional monarchies seemed securely based, and, although there were republican movements in a number of European countries, they appeared to pose no real threat to the sovereigns. Nevertheless, monarchs faced new challenges in the modern world, a world in which deference and respect for authority were in decline and in which institutions were required to justify themselves in utilitarian terms. In Britain the archbishop of Canterbury declared after the coronation of Elizabeth II in 1953 that the nation had been close to God on that occasion; in 1956 an opinion survey showed that 34% of the population believed that the queen had actually been chosen by God. Sentiments of that kind are now rare, and the European monarchies have been compelled to modernize themselves, becoming more open and more involved in welfare and charitable activities. The mystical monarchy has thus become transmuted into the welfare monarchy. Constitutional monarchy survives in a small number of nations in Western Europe, where it depends on popular support. In 1969 the duke of Edinburgh, husband of Queen Elizabeth II, declared that "it is a complete misconception to imagine that monarchy exists in the interests of the monarch. It doesn't. It exists in the interests of the people." In the contemporary world monarchy has become dependent upon the people, and yet at its best it serves not to limit the democratic principle of rule by the people but to underpin and sustain it.

Vernon Bogdanor is a professor of government at the University of Oxford.

Wildlife Conservation

Orangutans (Pongo pygmaeus) were among the species that suffered the loss of habitat and death at the hands of humans as they fled the fires in Borneo (Kalimantan), Sumatra, and other parts of Indonesia in 1998. More than 30,000 sq km (11,580 sq mi) burned between January and May. Almost all of Kutai National Park was destroyed, as was the Wein River Orangutan Sanctuary. In February the Truong Son muntjac deer from central Vietnam was described and named Muntiacus truongsonensis on the basis of 17 skulls and two tails obtained from hunters. In June the description of a new species of marmoset (Callithrix humilis) in Brazil was published. The marmoset, which did not appear to be endangered, had a known distribution covering some 250-300 sq km (95-115 sq mi), by far the smallest for any Amazonian primate. Another new species described in 1998 was a bird (Scytalopus iraiensis) found in an area that was to be flooded by a dam in Brazil. Work on the dam was suspended as a result of the discovery. The cherry-throated tanager (Nemosia rourei) was rediscovered in Brazil in February, 47 years after the last sighting. Two other bird rediscoveries were reported in March: the forest owlet (Athene blewitti), not recorded since 1884, was found in India, and a population of the critically endangered bearded wood partridge (Dendrortyx barbatus) was discovered in Mexico, where the species was last seen in 1986. A report published in March urged the protection of sharks and other elasmobranch fishes in North American waters. Shark fins had become one of the most valuable fisheries products in the world, and shark cartilage was also used in the growing Western health-food market. In August it was reported that not long after a commercial trawl fishery for rays started in the northwestern Atlantic Ocean, the largest species, the barn door skate (Raja laevis), was nearly gone.
In the U.S. 27 leading chefs took North Atlantic swordfish (Xiphias gladius) off their menus in response to the finding that the fishery had crashed. According to the Red List of Threatened Plants, published by the International Union for Conservation of Nature and Natural Resources in April, 12.5% of the world's plant species were threatened with extinction. The list of 33,798 species included 380 that were extinct in the wild and 371 that might be extinct. Of the species listed, 91% were endemic to a single country. Another report stated that many wild plants and animals used in medicine were becoming scarce in East Africa and southern African countries; it identified 102 plant species and 29 animal species as priorities for conservation action, including the African rock python and the baobab tree (Adansonia digitata). Almost 9,000 of the world's tree species (8,753 in all) were threatened, according to research results published in September; at least 77 species were already extinct, and 1,319 were classed as endangered. The 22nd meeting of the Parties to the Antarctic Treaty, held in Norway on May 25-June 6, failed to address the severe problem of illegal fishing for Patagonian toothfish (Dissostichus eleginoides) in the Southern Ocean. Illegal fishing was taking about 100,000 tons, compared with the 18,000 tons caught by the legal fishery, and the fish could soon become commercially extinct. The fishery also killed 5,000-154,000 seabirds annually, including threatened petrels and albatrosses. Invasions by alien species, already a serious threat to biodiversity, were expected to worsen in the future as the world warmed, according to an international workshop held in San Mateo, Calif., in April. It was believed that the tropical alga Caulerpa taxifolia, which invaded the Mediterranean Sea in the mid-1980s, could move up the Atlantic coast of Europe if ocean temperatures rose. In Tonga an introduced species of long-legged ant (Anoplolepis gracilipes) killed hatchlings of the native Tongan incubator birds, and the little red fire ant (Wasmannia auropunctata) had invaded New Caledonia and the Solomon Islands, where it attacked native vertebrates and caused the loss of native invertebrates that had key functions in the natural community. The problem of marine-invading species in Australia was being tackled by a pilot community-monitoring program aimed at the early detection of new invasive species and the development of knowledge about introduced species already present. By 1998 more than 150 introduced species had been discovered in Australian waters, of which eight were considered pests. In the Hawaiian Islands there were once 750 species of native land snails, more than 99% of them endemic. Most had become extinct or severely threatened, largely owing to the introduction of predatory carnivorous snails. The 1998 edition of the UN List of Protected Areas revealed a global network of more than 30,000 protected areas covering a total of 13.2 million sq km (5.1 million sq mi) designated under national legislation to conserve nature and associated cultural resources. One of the world's largest and most undisturbed tropical forests was permanently protected in June when Suriname created a 16,200-sq-km (6,250-sq-mi) reserve covering some 10% of the country's land area. An infectious agent was suspected in a mass mortality of Adélie penguin (Pygoscelis adeliae) chicks in Antarctica.
Antibodies of the avian pathogen infectious bursal disease virus had been found in penguins from colonies near human activity. A possible source of the virus was humans' careless disposal of poultry products or contaminated clothing or vehicles. In January-February, 1,345 New Zealand sea lion (Phocarctos hookeri) pups and 85 adults died from septicaemia. Biopsies of the sea lions, which lived only in the Auckland Islands, revealed salmonella and a second, unidentified bacterium. A new fungal disease was shown to be the cause of death in amphibians found dead at pristine rain-forest sites in Australia and Central America. The fungus, found in the keratinized cells of the skin of adult amphibians, appeared to be the same pathogen on both continents and probably caused death by interfering with supplementary water uptake or respiration through the skin. The disease was identified as the cause of death of frogs and toads belonging to nine genera, including Taudactylus acutirostris, an Australian species that might have become extinct. In June conservationists celebrated the fact that 10 mountain gorillas had been born in the Virunga National Park in the Democratic Republic of the Congo since the onset of civil unrest 18 months previously, but in September two mountain gorillas were killed by poachers in the park. In 1998 there were only about 600 mountain gorillas left.

JACQUI M. MORRIS

World Affairs

The global recession that spread in 1998, the deepest since World War II in parts of the world, was bound to have far-ranging political consequences. All countries were affected by it, but some, by necessity, more than others. Most severely hit were the underfunded Asian economies and, as a result of declining commodity prices, countries heavily dependent on the export of raw materials such as oil. At the same time, it became abundantly clear that the international financial system that had served the world economy well since 1945 was in urgent need of reform. Consequently, international economics rather than political issues took pride of place in the deliberations of world leaders of the developed as well as the less-developed countries during the year under review. The immediate crisis began in Thailand in July 1997 and subsequently spread to South Korea, one of the powerhouses of the world economy; the South Korean crisis necessitated the largest intervention ever made by the World Bank. By early 1998 Malaysia was affected; its government placed the responsibility on international currency speculators. At the same time, also in the wake of an economic crisis, student unrest in Indonesia led to widespread riots, often against the ethnic Chinese, and eventually, in late May, to the resignation of President Suharto (see BIOGRAPHIES), who had ruled the country for more than three decades. Finally, during the spring and summer of 1998, Japan, the leading force in Asia and the second largest economy in the world, had to face the full impact of the crisis, which also led to a change of government. The reasons for the economic meltdown were somewhat different in each case. In Japan the banking system was heavily indebted, and the government's policy of dealing with the situation by cutting taxes and increasing spending to stimulate economic growth had only a limited effect. Elsewhere, the immediate problem was the overvaluation of the countries' assets, which tempted speculators but eventually led to a sharp decline in the value of the national currencies.
In most cases, however, these were not just currency crises. The basic reasons went deeper. They were structural in character and included weak banking systems; "crony capitalism," in which funds were diverted to those favoured by government leaders rather than to those most capable of using them; bad loans; investments in unproductive enterprises; and, in some countries, ineffective tax collection. These weaknesses helped to create a situation that made the economies of those nations particularly vulnerable to sudden movements of global capital such as the withdrawal of investments. Much thought was given to finding a way out of the crisis--in the short run by establishing rescue funds and stabilizing the currencies and the stock exchanges of the countries most affected. At meetings of heads of government in Singapore, Washington, Moscow, and elsewhere, it became clear that the international organizations established after World War II (the World Bank and the International Monetary Fund) did not have sufficient means to cope with this assignment and that new initiatives were needed. What was originally known as the "Asian currency crisis" had ripple effects and by late 1998 had generated a general crisis of confidence that manifested itself notably on the world's stock markets. Among the Latin-American countries, Venezuela, Colombia, and, above all, Brazil were affected by the free-market upheaval. The opening up of the Brazilian market had brought that country $45 billion in investments in 1997, but it also caused high interest rates and lost jobs, a high social cost for large sections of the population. At the year's end the challenges facing Brazil to prevent a collapse were daunting. (See Spotlight: The Troubled World Economy.) The political consequences of the economic crisis were only too clear in the case of Russia. The political and social equilibrium of the country had been tenuous for years, and the immediate crisis came to a head in March when Pres. Boris Yeltsin suddenly dismissed Prime Minister Viktor Chernomyrdin because (according to the official version) his government had been lacking in "initiative and dynamism." He was replaced by a young and not-very-well-known technocrat, Sergey Kiriyenko, whose government lasted only five months. The pressures on the ruble, reflecting the weakness of the economy, resulted in a disastrous fall in the value of the currency. Massive tax evasion also continued, and the government found itself unable to service the massive loans it had received or, worse yet, pay its employees. Yeltsin, whose hold grew weaker as his health deteriorated, wanted Chernomyrdin back, but the legislature refused to give its approval, and as a compromise Yevgeny Primakov was appointed prime minister. The Russian crisis caused alarm in the West, both in view of its potential political repercussions and because there seemed to be no obvious solution. Pouring more money into the Russian economy would not be a long-term solution, but allowing the country to slide into chaos was equally unacceptable. The economic crisis mostly overshadowed political and military tensions. One exception was the explosion of five nuclear devices by the new nationalist government in India (and six more by Pakistan two weeks later). These events confirmed that nuclear proliferation was running its inexorable course. 
International investigations did not establish that Iraq and Iran had been building nuclear weapons, but the investigations did conclude that those countries had acquired long-distance missiles as well as nonconventional weapons. U.S. attempts to continue UN inspections of possible production of weapons of mass destruction in Iraq were successfully barred by the Iraqi government. Major civil wars in Africa (Democratic Republic of the Congo and Sierra Leone) continued their course. The innovation in these wars was the intervention in the conflicts by neighbouring countries such as Nigeria, Zimbabwe, Angola, and South Africa. In addition, there were armed conflicts in Lesotho, The Sudan, and Burundi, and for a short time full-scale war broke out between Eritrea and Ethiopia, which had formerly been close allies. In Europe the trend toward political success for left-of-centre parties continued with the victory in Germany of the Social Democrats over the Christian Democrats, who had been in power for 16 years. In Italy the left-of-centre government headed by Romano Prodi collapsed, but his successor was the former communist Massimo D'Alema. The military confrontations in former Yugoslavia continued, but the focus switched from Bosnia and Herzegovina to Kosovo, where the Serbian government at the end of February mounted a military offensive against Kosovo's ethnic Albanian majority, who were seeking greater autonomy. Repression and fighting increased during the subsequent months as a separatist Albanian army took form, and it was only owing to severe pressure and the issuing of a NATO ultimatum that the Yugoslav government in Belgrade showed willingness to negotiate with the ethnic Albanians. American peace initiatives were successful in Northern Ireland. A majority of the Irish electorate voted for new constitutional arrangements that seemed to satisfy the minimum demands of the two warring sides in a conflict that had lasted for centuries. American peacemaking faced greater difficulties in its attempts to bring Israelis and Palestinians together. The accords achieved in a conference in Maryland in October helped to give fresh momentum to a peace process that had run out of steam. Given the widely differing long-term aspirations of the two sides, however, it was difficult to regard this as more than a stopgap measure in a conflict likely to endure for a long time.

WALTER LAQUEUR

World Cup

France easily won the 16th World Cup, beating Brazil 3-0 in the final at Saint-Denis, near Paris, on July 12, 1998. The tournament, which had 32 finalists for the first time, was largely disappointing, the overall standard of play being generally of a low-key nature. Although the Fédération Internationale de Football Association (FIFA) called France 98 a successful World Cup, most independent critics rated it as a tournament of quantity rather than quality. The delay of almost a week between teams' first and second matches unnecessarily prolonged the competition in its initial stage, in sharp contrast to the knockout phase in the second round, when the first two matches provided the winners with a respite of another six days compared with only four days' rest for those playing on the last day of the round. Statistics revealed that 1,881 shots were taken in the 64 matches, 891 of them on target, from which 171 goals were scored. There were 667 corners and 379 offside decisions, as well as 2,135 offenses of one kind or another.
Considering that 64 matches were contested, it was not surprising that a record number of yellow (250) and red (22) cards were shown to players. There was an alarming increase in the unlawful use of hands and arms by players in tackling opponents and in attempting to gain unfair advantage at set pieces. One of the French trio shown red cards was Zinedine Zidane (see BIOGRAPHIES). After being suspended for two games, Zidane returned as the saviour of France in the final, in which he scored the first two goals with rare headed finishes from corner kicks--the first in the 27th minute and the second within seconds of the halftime interval. In the dying seconds of the match, Emmanuel Petit added a third goal for France. It was the 1,000th in the country's football history and the 1,755th overall in World Cup finals. The real drama of the final match had occurred before the kickoff, when Brazil's Ronaldo, the FIFA Player of the Year, was rushed to the hospital for tests following a night in which he had suffered a seizure. He was named a late addition to the Brazilian team, but he was clearly not in either the right physical or mental condition for playing in a match of this magnitude. The episode seemed to affect the entire Brazilian team, which gave one of its poorest displays in a final tournament. In a competition devoid of memorable individual accomplishment, outstanding scoring contributions were made by David Beckham with a free kick for England against Colombia, by Michael Owen with a breathtaking solo effort for England against Argentina, and, in the finest effort of all, by Dennis Bergkamp of The Netherlands. Against Argentina, Bergkamp controlled a lofted 46-m (50-yd) pass from Frank de Boer with one touch, beat his marker with the second, and finished clinically with the third to score the winning goal in the 89th minute. Bergkamp, however, had been fortunate to avoid a red card in the match against Yugoslavia, and Beckham was dismissed for a moment of stupidity against Argentina. Croatia was the surprise team, deservedly finishing in third place, and Croatia's Davor Suker was the tournament's leading scorer with six goals. Fan violence was chiefly restricted to England's followers, though the German fans were involved in some unsavoury incidents. Lothar Matthaus of Germany set a World Cup record by appearing in his 25th match in a final tournament, increasing his overall total of games for his country to 129. Within two months of the final match, 22 of the 32 national coaches involved in the tournament had either been fired or had resigned.

JACK ROLLIN
