(Excerpts from a message dated 08-31-99, David Noon to Murray Lesser)
Hi David--
ML> ASCII became a computer problem in 1964 with IBM's announcement of
ML>the 8-bit System/360 and its associated 8-bit (plus parity) tape drives.
ML>The ANSI committee met to decide how to "embed" seven-bit ASCII in the
ML>new 8-bit medium. Officially, ASCII is still a 7-bit code: the eighth
ML>(high-order) bit in "true" ASCII embodiments is always zero! The eighth
ML>bit was put to use (for desktop machines) in the IBM PC (1981) with the
ML>IBM-defined "extended ASCII" that made the top 128 characters available
ML>for keyboard input, but has never been adopted as an official standard.
DN>Yes, ASCII was junk then and is still junk now, compared to EBCDIC.
DN>What I could never fathom was why IBM allowed the PC to be developed
DN>using ASCII, when all the other platforms IBM manufactured in the
DN>late 1970's used EBCDIC. The S/370, S/3x, S/7 and S/1 were all EBCDIC
DN>boxes, as were all of IBM's network front-end processors. All the
DN>control sequences for BSC and SDLC communications were EBCDIC. In
DN>that context, ASCII made no sense at all!
DN>It still doesn't.
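    Your point about the all-EBCDIC shop is easy to make concrete.
Here is a quick sketch in modern Python (purely my choice for
illustration; nothing period-accurate about it) showing that the two
codes agree on essentially nothing, so an ASCII box dropped into that
network needed translation at every boundary:

    # Illustration only: the same text as ASCII bytes and as EBCDIC
    # bytes (code page 037, Python's built-in "cp037" codec).
    text = "IBM PC"
    print(text.encode("ascii").hex(" "))   # 49 42 4d 20 50 43
    print(text.encode("cp037").hex(" "))   # c9 c2 d4 40 d7 c3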
Nowadays, the Average Idiot Home User hasn't heard of any character
code other than ASCII, if even that :-(. As you well know, the stupid
collating sequence of ASCII is due to the great desire of AT&T to pack
everything they thought anybody would ever want into seven bits, while
letting the presence of a single bit differentiate between lower- and
upper-case alpha characters (a desirable characteristic only for fully
mechanical terminals). Making sense (from a computer-architectural
standpoint) has never been a requirement for standards committees! (I
have a great dislike, based on many years of past participation, of most
computer-related standardizing activities. The standards committee
members seem to be more interested in showing how smart they are than in
following Euripides' "legacy" law of computer architecture: "The gods
visit the sins of the fathers upon the children.")
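    That single-bit case trick, and the collating order it bought us,
are easy to show. Another Python sketch (again, illustration only),
using the interpreter's built-in cp037 codec as a stand-in for EBCDIC:

    # Bit 0x20 is the only difference between an upper-case ASCII
    # letter and its lower-case twin.
    print(hex(ord('A')), hex(ord('a')))    # 0x41 0x61
    print(chr(ord('a') & ~0x20))           # A  (clear the bit to upcase)

    # The price: in ASCII every upper-case letter collates ahead of
    # every lower-case one.
    print(sorted("aZbY"))                  # ['Y', 'Z', 'a', 'b']

    # EBCDIC (code page 037) collates lower case first, upper case
    # next, digits last.
    mixed = sorted("aZbY1".encode("cp037"))
    print(bytes(mixed).decode("cp037"))    # abYZ1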
As to why the PC architecture included the use of ASCII as an
internal character code: since I had nothing to do with the IBM PC
development, I have only some conjectures. The PC was developed by a
semi-independent group in Florida. The then upper management of IBM
didn't believe it was ever going to go anywhere, so probably didn't care
that the original perpetrators were using their 8-bit version of ASCII
instead of EBCDIC! The character-code choice may have had something to
do with the fact that the 7-bit version of ASCII (embedded in eight
bits) was being used in most desktop machines up to that date. (My CP/M
machine, vintage 1979, used a so-called ASCII terminal that had no way
to input the high-order eighth bit.) Only some word processors of the
time used that bit for anything.
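    For what it's worth, the "embedding" amounts to nothing more than
keeping the high-order bit zero, which is why simply stripping that
bit was enough to recover plain text from files whose editors had
borrowed it (WordStar's document mode is the example usually cited).
A tiny Python sketch; the strip_high_bit name is mine, purely
illustrative:

    # Illustrative helper: true 7-bit ASCII always has the eighth bit
    # clear, so masking with 0x7F recovers the text from bytes whose
    # high bit was borrowed for other purposes.
    def strip_high_bit(data: bytes) -> bytes:
        return bytes(b & 0x7F for b in data)

    sample = bytes([0x48, 0xE9])    # 'H', then 'i' with its high bit set
    print(strip_high_bit(sample).decode("ascii"))    # Hi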
The 8088/86 instruction set was derived from the 8080 set, and two
of the three operating systems originally offered for the IBM PC were
(to some extent) "upward compatible" from CP/M. IMO, the PC fathers did
the best that they could to make ASCII barely usable, supplying both the
missing "upper" 128 characters and also text-mode graphic symbols for
the otherwise useless (by then) ASCII control characters. We OS/2
children are still living with the results of their sins.
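    That IBM-defined upper half later became known as code page 437.
Anyone curious what the PC fathers put into the 128 positions that
7-bit ASCII left empty can peek at it through Python's cp437 codec
(illustration only; note that the graphic glyphs assigned to the
control characters lived in the display hardware's font, so no
code-page translation will show those):

    # The PC's "extended ASCII" upper half: accented letters, box-drawing
    # characters, Greek letters and math symbols in positions 0x80-0xFF.
    upper = bytes(range(0x80, 0x100))
    print(upper.decode("cp437"))

    # A few of the box-drawing characters that made text-mode "windows"
    # possible: the top edge of a double-line box.
    print(bytes([0xC9, 0xCD, 0xCD, 0xBB]).decode("cp437"))    # ╔══╗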
Regards,
--Murray
___
 * MR/2 2.25 #120 * Old engineering adage: there is more than one way to skin a cat
--- Maximus/2 2.02
* Origin: OS/2 Shareware BBS, telnet://bbs.os2bbs.com (1:109/347)