ARTIFICIAL INTELLIGENCE



(AI), the capacity of a digital computer or computer-controlled robot to perform tasks commonly associated with the higher intellectual processes characteristic of humans, such as the ability to reason, discover meaning, generalize, or learn from past experience. The term is also frequently applied to the branch of computer science concerned with developing systems endowed with such capabilities.

Research on artificial intelligence began soon after the development of the modern digital computer in the 1940s. Early investigators quickly recognized the potential of computing devices as a means of automating thought processes. Over the years it has been demonstrated that computers can be programmed to carry out very complex tasks, such as discovering proofs for theorems or playing chess, with great proficiency. Some computer programs used to perform AI tasks are designed to manipulate symbolic information at extremely high speeds in order to compensate for their partial lack of human knowledge and selectivity; such programs are usually called expert systems. Other programs are designed to simulate human problem solving through highly selective search and recognition methods rather than through superhuman processing speeds. Both expert systems and programs simulating human methods have attained the performance levels of human experts and professionals at certain specific tasks, but by the mid-1990s there were still no programs that could match human flexibility over wider domains or in tasks requiring much everyday knowledge.

Knowledge-based expert systems enable computers to make decisions for solving complicated nonnumerical problems. These expert systems consist of hundreds or thousands of if-then logic rules formulated with knowledge gleaned from leading authorities in a given field; a schematic example of such rules appears below. The MYCIN program, for instance, has been used to help physicians diagnose certain forms of bacterial blood infections and to determine suitable treatments. A computer running MYCIN first makes a plausible guess as to the patient's condition on the basis of observed symptoms, then determines how well that tentative diagnosis fits all known facts about the behaviour of the microorganism thought to be involved. Once the computer has identified the cause of the infection, it reviews the kinds of antibiotics available and recommends one or several alternative forms of therapy.

Programs have also been developed that enable computers to comprehend commands in a natural language, such as ordinary English. The software systems of this type produced so far are limited in their vocabulary and knowledge to specific, narrowly defined subject areas. They contain large amounts of information about the meaning of words pertaining to that subject, as well as information about grammatical rules and common violations of those rules.

The ability to identify graphic patterns or images is associated with artificial intelligence, since it involves both cognition and abstraction. In a system designed with this capability, a device linked to a computer scans, senses, and transforms images into digital patterns, which in turn are compared with patterns stored in the computer's memory. The stored patterns can represent geometric shapes and forms that the computer has been programmed to (or has learned to) identify. The computer processes the incoming patterns in rapid succession, isolating relevant features, filtering out unwanted signals, and adding to its memory any new patterns that deviate beyond a specified threshold from the old and are thus perceived as new entities.
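The comparison of incoming patterns against stored patterns, with a deviation threshold that decides when a pattern counts as a new entity, can be summarized in a minimal Python sketch. The code below is illustrative only and does not describe any particular recognition system; it assumes patterns have already been digitized into equally sized NumPy arrays and uses mean squared difference as the measure of deviation.

    import numpy as np

    def closest_match(incoming, stored_patterns, threshold):
        # Return the index of the best-matching stored pattern, or None
        # if every stored pattern deviates beyond the threshold.
        best_index, best_distance = None, float("inf")
        for i, pattern in enumerate(stored_patterns):
            # Mean squared difference as a simple measure of deviation.
            distance = float(np.mean((incoming - pattern) ** 2))
            if distance < best_distance:
                best_index, best_distance = i, distance
        return best_index if best_distance <= threshold else None

    def recognize(incoming, stored_patterns, threshold=0.05):
        # Identify the incoming pattern, or store it as a new entity.
        match = closest_match(incoming, stored_patterns, threshold)
        if match is None:
            stored_patterns.append(incoming)  # remember the new entity
            return len(stored_patterns) - 1
        return match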
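Similarly, the if-then rule formulation used by knowledge-based expert systems, described earlier, can be sketched as a small forward-chaining loop. The rules and vocabulary below are invented for illustration and are not drawn from MYCIN or any real medical knowledge base; actual systems of this kind attach certainty measures to their rules rather than treating conclusions as definite.

    # Invented if-then rules: each pairs a set of required findings with a conclusion.
    RULES = [
        ({"gram_negative", "rod_shaped"}, "suspect_enterobacteriaceae"),
        ({"suspect_enterobacteriaceae", "hospital_acquired"}, "consider_aminoglycoside"),
        ({"fever", "low_blood_pressure"}, "possible_sepsis"),
    ]

    def forward_chain(observed, rules):
        # Apply the rules repeatedly until no rule adds a new conclusion.
        facts = set(observed)
        changed = True
        while changed:
            changed = False
            for conditions, conclusion in rules:
                if conditions <= facts and conclusion not in facts:
                    facts.add(conclusion)
                    changed = True
        return facts

    print(forward_chain({"gram_negative", "rod_shaped", "hospital_acquired"}, RULES))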
Major and continuing advances in computer processing speeds and memory sizes have facilitated the development of AI programs. Although most AI programs attempting to simulate higher mental functions incorporate the bottleneck of limited short-term memory, which restricts humans to carrying out only one or a few mental tasks at a time, many investigators have begun to explore how the intelligence of computer programs can be enhanced by incorporating parallel processing, i.e., the simultaneous execution of several separate operations by means of computer memories that allow many processes to be carried out at once. The question of which portions of the human brain (and their corresponding thought processes) operate serially and which operate in parallel has been a topic of intense debate among researchers in both the cognitive sciences and AI, but no clear verdict had been reached by the mid-1990s. The largest computer memories now contain elementary circuits that are comparable in number to the synaptic connections (about 10 trillion) in the human brain, and they operate at speeds (billions of operations per second) that are far faster than elementary neural speeds (at most thousands of operations per second).

The challenge driving AI research is to understand how computers' capabilities must be organized in order to reproduce the many kinds of mental activity encompassed by the term thinking. AI research has thus focused on understanding the mechanisms involved in human mental tasks and on designing software that performs similarly, starting with relatively simple tasks and continually progressing to levels of greater complexity.

Herbert A. Simon
