Hi Murray,
You wrote to David Noon:
ML> Nowadays, the Average Idiot Home User hasn't heard of any character
ML>code other than ASCII, if even that :-(. As you well know, the stupid
ML>collating sequence of ASCII is due to the great desire of AT&T to pack
ML>everything they thought anybody would ever want into seven bits, while
ML>letting the presence of a single bit differentiate between lower- and
ML>upper-case alpha characters (a desirable characteristic only for fully
ML>mechanical terminals). Making sense (from a computer-architectural
ML>standpoint) has never been a requirement for standards committees! (I
ML>have a great dislike, based on many years of past participation, of most
ML>computer-related standardizing activities. The standards committee
ML>members seem to be more interested in showing how smart they are than in
ML>following Euripides' "legacy" law of computer architecture: "The gods
ML>visit the sins of the fathers upon the children.")
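The single-bit case distinction Murray describes is easy to demonstrate: in ASCII, bit 5 (0x20) is the only difference between an upper-case letter and its lower-case partner ('A' is 0x41, 'a' is 0x61). A minimal sketch (the helper name is mine):

```python
def toggle_case(ch):
    """Toggle the case of an ASCII letter by flipping bit 5 (0x20)."""
    if ch.isascii() and ch.isalpha():
        return chr(ord(ch) ^ 0x20)
    return ch  # non-letters pass through unchanged

print(toggle_case('A'))  # -> a
print(toggle_case('z'))  # -> Z
```

Handy on a fully mechanical terminal; not much of an architectural virtue otherwise.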
I think of the ANSI screen control sequences as a classic example of
that "cleverness", even though they are really DEC terminal control
sequences.
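Those sequences are all just ESC [ followed by parameters; a sketch of the two classic ones (the byte values are from the VT100/ANSI repertoire, the variable names are mine):

```python
ESC = "\x1b"         # ASCII 27, the ESC control character
CSI = ESC + "["      # Control Sequence Introducer

clear_screen = CSI + "2J"  # ED: erase the entire display
cursor_home  = CSI + "H"   # CUP: cursor to row 1, column 1

# On an ANSI/VT100-compatible terminal this clears the screen
# and homes the cursor:
print(clear_screen + cursor_home, end="")
```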
ML> As to why the PC architecture included the use of ASCII as an
ML>internal character code: since I had nothing to do with the IBM PC
ML>development, I have only some conjectures. The PC was developed by a
ML>semi-independent group in Florida. The then upper management of IBM
ML>didn't believe it was ever going to go anywhere, so probably didn't care
The original estimate was 250,000 units over five years, according to an
article in Byte in 1990. It should have been dead, buried, and history
by now...
ML>that the original perpetrators were using their 8-bit version of ASCII
ML>instead of EBCDIC! The character-code choice may have had something to
ML>do with the fact that the 7-bit version of ASCII (embedded in eight
ML>bits) was being used in most desktop machines up to that date. (My CP/M
ML>machine, vintage 1979, used a so-called ASCII terminal that had no way
ML>to input the high-order eighth bit.) Only some word processors of the
ML>time used that bit for anything.
Compatibility with the CP/M-80 machines they were designed to replace is
probably the only reason (and the fact that Boca Raton was far away from
the "Big Iron" development sites).
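That unreachable eighth bit is why so many filters of the era simply masked it off; some word processors (WordStar, famously) used the high bit as a flag within otherwise ordinary text. A sketch of the usual cleanup, assuming such "high-bit" text as input:

```python
def strip_high_bit(data: bytes) -> bytes:
    """Mask each byte down to 7-bit ASCII, as filters for
    high-bit-flagged word-processor text used to do."""
    return bytes(b & 0x7F for b in data)

# 0xE5 is 'e' (0x65) with the high bit set:
print(strip_high_bit(b"fil\xe5"))  # -> b'file'
```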
ML> The 8088/86 instruction set was derived from the 8080 set, and two
ML>of the three operating systems originally offered for the IBM PC were
ML>(to some extent) "upward compatible" from CP/M. IMO, the PC fathers did
ML>the best that they could to make ASCII barely usable, supplying both the
ML>missing "upper" 128 characters and also text-mode graphic symbols for
ML>the otherwise useless (by then) ASCII control characters. We OS/2
ML>children are still living with the results of their sins.
Are you thinking of the same three as me? PC DOS, CP/M-86, and the UCSD
p-System.
Not all the control characters were useless (CR, LF, FF and TAB are
officially control characters!). I have been known to use DC1 and DC3
for flow control on a serial link - and the fact that the PC honours
them internally has been known to cause users problems...
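For reference, DC1 and DC3 are the XON/XOFF software flow-control pair, typed as Ctrl-Q and Ctrl-S (the constant names below are mine):

```python
# A receiver sends DC3 (XOFF) to ask the sender to pause and
# DC1 (XON) to resume transmission.
DC1_XON  = 0x11   # Ctrl-Q: resume
DC3_XOFF = 0x13   # Ctrl-S: pause

# This is why an accidental Ctrl-S can appear to "hang" a console
# or serial session until a Ctrl-Q is typed.
print(hex(DC1_XON), hex(DC3_XOFF))
```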
George
___
X SLMR 2.1a X Study the past, if you would divine the future.
--- Maximus/2 3.01
* Origin: Air Applewood, OS/2 Gateway to Essex 44-1279-792300 (2:257/609)