Meaning of VAXOCENTRISM in English
transcription: [ /vak`soh-sen'trizm/ n. ]
[analogy with `ethnocentrism'] A notional disease said to afflict C programmers who persist in coding according to certain assumptions that are valid (esp. under Unix) on VAXen but false elsewhere. Among these are:

1. The assumption that dereferencing a null pointer is safe because it is all bits 0, and location 0 is readable and 0. Problem: this may instead cause an illegal-address trap on non-VAXen, and even on VAXen under OSes other than BSD Unix. Usually this is an implicit assumption of sloppy code (forgetting to check the pointer before using it), rather than deliberate exploitation of a misfeature.

2. The assumption that characters are signed.

3. The assumption that a pointer to any one type can freely be cast into a pointer to any other type. A stronger form of this is the assumption that all pointers are the same size and format, which means you don't have to worry about getting the casts or types correct in calls. Problem: this fails on word-oriented machines or others with multiple pointer formats.

4. The assumption that the parameters of a routine are stored in memory, on a stack, contiguously, and in strictly ascending or descending order. Problem: this fails on many RISC architectures.

5. The assumption that pointer and integer types are the same size, and that pointers can be stuffed into integer variables (and vice-versa) and drawn back out without being truncated or mangled. Problem: this fails on segmented architectures or word-oriented machines with funny pointer formats.

6. The assumption that a data type of any size may begin at any byte address in memory (for example, that you can freely construct and dereference a pointer to a word- or greater-sized object at an odd char address). Problem: this fails on many (esp. RISC) architectures better optimized for HLL execution speed, and can cause an illegal address fault or bus error.

7. The (related) assumption that there is no padding at the end of types and that in an array you can thus step right from the last byte of a previous component to the first byte of the next one. This is not only machine- but compiler-dependent.

8. The assumption that memory address space is globally flat and that the array reference foo[-1] is necessarily valid. Problem: this fails at 0, or other places on segment-addressed machines like Intel chips (yes, segmentation is universally considered a brain-damaged way to design machines (see moby), but that is a separate issue).

9. The assumption that objects can be arbitrarily large with no special considerations. Problem: this fails on segmented architectures and under non-virtual-addressing environments.

10. The assumption that the stack can be as large as memory. Problem: this fails on segmented architectures or almost anything else without virtual addressing and a paged stack.

11. The assumption that bits and addressable units within an object are ordered in the same way and that this order is a constant of nature. Problem: this fails on big-endian machines.

12. The assumption that it is meaningful to compare pointers to different objects not located within the same array, or to objects of different types. Problem: the former fails on segmented architectures, the latter on word-oriented machines or others with multiple pointer formats.

13. The assumption that an int is 32 bits, or (nearly equivalently) the assumption that sizeof(int) == sizeof(long). Problem: this fails on PDP-11s, 286-based systems and even on 386 and 68000 systems under some compilers (and on 64-bit systems like the Alpha, of course).

14. The assumption that argv is writable.
For the full description, refer to http://www.tuxedo.org/~esr/jargon/
Jargon File English vocabulary. 2012