-=> Quoting George White to Murray Lesser <=-
ML> Nowadays, the Average Idiot Home User hasn't heard of any character
ML>code other than ASCII, if even that :-(. As you well know, the stupid
ML>collating sequence of ASCII is due to the great desire of AT&T to pack
ML>everything they thought anybody would ever want into seven bits, while
ML>letting the presence of a single bit differentiate between lower- and
ML>upper-case alpha characters (a desirable characteristic only for fully
ML>mechanical terminals).
ASCII *was* defined in 1963 (or was it '68?), you know. It was originally
intended as a standard for moving data between different brands of
mainframes (which all had their *own* character sets back then).
Also, that uppercase/lowercase distinction being one bit was important
for programmers writing *tight* code. I used to use that single-bit
trick in programs back when 16K of DRAM cost several hundred dollars.
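For the curious, here's roughly what that trick looks like in C. Just a
sketch, and it only works in the plain ASCII alpha range, where upper
and lower case differ in exactly one bit (bit 5, 0x20):

    #include <stdio.h>

    /* In ASCII, 'A' is 0x41 and 'a' is 0x61: only bit 5 (0x20) differs.
       Clearing or setting that one bit converts case with no table
       lookup -- cheap back when every byte of RAM counted. */
    int main(void)
    {
        char c = 'g';
        char upper = c & ~0x20;   /* clear bit 5: 'G' */
        char lower = c | 0x20;    /* set bit 5:   'g' */
        printf("%c %c\n", upper, lower);
        return 0;
    }

(Obviously you have to know the byte really is a letter first; digits
and punctuation get mangled by the same operation.)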
ML>Making sense (from a computer-architectural
ML>standpoint) has never been a requirement for standards committees! (I
ML>have a great dislike, based on many years of past participation, of most
ML>computer-related standardizing activities. The standards committee
ML>members seem to be more interested in showing how smart they are than in
ML>following Euripides' "legacy" law of computer architecture: "The gods
ML>visit the sins of the fathers upon the children.")
GW> I think of the ANSI screen control sequences as a classic example of
GW> that "cleverness", even though they are really DEC terminal control
GW> sequences.
They are derived from DEC control sequences for the VT-52. The ESC was
turned into ESC[ so that DEC wouldn't have an unfair advantage in the
market. The VT-100 came *after* the X3.64 standard was defined.
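For reference, every X3.64 sequence starts with ESC (0x1B) followed by
'[', then decimal parameters separated by ';', then a final letter that
selects the function. A quick C sketch, assuming the terminal (or
ANSI.SYS under DOS) actually honors these sequences:

    #include <stdio.h>

    int main(void)
    {
        printf("\x1b[2J");               /* 'J' = erase display (2 = whole screen) */
        printf("\x1b[10;5H");            /* 'H' = cursor position: row 10, col 5   */
        printf("\x1b[1mhello\x1b[0m\n"); /* 'm' = graphic rendition: bold on, reset */
        return 0;
    }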
ML> As to why the PC architecture included the use of ASCII as an
ML>internal character code: since I had nothing to do with the IBM PC
ML>development, I have only some conjectures. The PC was developed by a
ML>semi-independent group in Florida. The then upper management of IBM
ML>didn't believe it was ever going to go anywhere, so probably didn't care
GW> Original estimate was 250,000 units over 5 years according to an
GW> article in Byte in 1990. It should have been dead, buried, and history
GW> by now...
ML>that the original perpetrators were using their 8-bit version of ASCII
ML>instead of EBCDIC! The character-code choice may have had something to
ML>do with the fact that the 7-bit version of ASCII (embedded in eight
ML>bits) was being used in most desktop machines up to that date. (My CP/M
ML>machine, vintage 1979, used a so-called ASCII terminal that had no way
ML>to input the high-order eighth bit.) Only some word processors of the
ML>time used that bit for anything.
Excuse me, but you've made the same mistake 3 times so far. There is no
such thing as "8-bit ASCII", nor is there "the 7-bit version of ASCII".
ASCII is *defined* as a 7-bit set. Any 8-bit set whose lower 128
characters match ASCII is an "extended ASCII" and *not* any sort of
official variant of ASCII.
It *can* be a standard in its own right, such as the various ISO
8859-xx character sets.
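Checking which side of that line some data falls on is trivial:
anything with the high bit set isn't ASCII, period. A rough C sketch
(is_pure_ascii is just a name I made up for illustration):

    #include <stdio.h>

    /* ASCII is defined only for code points 0-127; any byte with the
       high bit set belongs to some "extended ASCII" or ISO 8859-x set,
       not to ASCII itself. */
    static int is_pure_ascii(const unsigned char *s)
    {
        for (; *s; s++)
            if (*s > 0x7f)
                return 0;
        return 1;
    }

    int main(void)
    {
        printf("%d\n", is_pure_ascii((const unsigned char *)"plain text"));
        return 0;
    }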
ML>(to some extent) "upward compatible" from CP/M. IMO, the PC fathers did
ML>the best that they could to make ASCII barely usable, supplying both the
ML>missing "upper" 128 characters and also text-mode graphic symbols for
ML>the otherwise useless (by then) ASCII control characters.
ASCII *has* no "upper 128".
And by the time the PC was being designed it was quite clear that the
32-126 range had to be the same as ASCII for a machine to do well in
the market, and that the more common control chars had to be supported.
*Why* they made DEL a printable char I have no idea.
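Just to spell out the ranges: 0-31 and 127 (DEL) are control
characters, 32-126 are the printable set. A small C sketch of the
classification (the glyph the PC hangs on 127 is the oddity I mean):

    #include <stdio.h>

    /* 0-31 are controls, 32-126 are printable ASCII, and 127 (DEL) is
       also a control even though the PC's character generator draws a
       glyph for it. */
    static const char *kind(unsigned char c)
    {
        if (c == 127)            return "DEL (control)";
        if (c >= 32 && c <= 126) return "printable";
        return "control";
    }

    int main(void)
    {
        printf("%s / %s / %s\n", kind('\n'), kind('A'), kind(127));
        return 0;
    }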
--- Blue Wave/DOS v2.30
* Origin: Shadowshack (1:105/51)