In a message dated 08-29-99, Murray Lesser said to Roy J. Tellason about
"Copymania"
Hi Murray,
RT>Let's see, I'll bet they used that in that RT-11 OS I
RT>played with back in '78 or so, do you think? :-)
ML> There is no way that anyone can blame the use of Ctrl-Z as an EOF
ML>mark on any desktop operating system :-).
Well, RT-11 ran on DEC PDP-11 machines, most of which were the size of a
refrigerator, but only served to keep beer hot, not cold.
ML> Ctrl-Z (1ah) has been part of the ASCII 7-bit code since its
ML>beginning (around 1960, IIRC). The first three of the four "separator"
ML>control characters, 1ch through 1eh (FS, GS, and RS), were
ML>originally intended for data transmission segment separators. IIRC,
ML>Ctrl-Z was the "user-defined separator" to be used in other
ML>environments. IMO, ASCII was originally devised to accommodate AT&T,
ML>who realized that the 5-bit Baudot code (used previously for TTY
ML>systems) was no longer sufficient for "modern" data transmission.
Correct. ASCII was originally conceived as the 7-bit character set to be
used by the paper tape readers attached to the sides of ASR-33 TTY's, mostly
made by ITT.
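(Those separator characters are still sitting in the code chart, by the way,
and nothing stops you from using them. Here is a toy C sketch of my own, not
anything Murray or the standard prescribes, just to show US and RS used as
field and record delimiters:)

    /* Toy illustration: the ASCII "unit separator" (1Fh) between
       fields and the "record separator" (1Eh) between records,
       then splitting the data back apart.  Sample data is made up. */
    #include <stdio.h>

    #define US 0x1F   /* unit separator: between fields   */
    #define RS 0x1E   /* record separator: between records */

    int main(void)
    {
        /* Two records of three fields each. */
        char data[] = "alpha\x1f" "beta\x1f" "gamma\x1e"
                      "one\x1f"   "two\x1f"  "three\x1e";

        for (char *p = data; *p != '\0'; p++) {
            if (*p == US)      putchar('\n');   /* next field  */
            else if (*p == RS) printf("\n\n");  /* next record */
            else               putchar(*p);
        }
        return 0;
    }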
ML>The
ML>computer manufacturers were snookered into going along with the 7-bit
ML>idea, because (at the time) nobody (except IBM) was contemplating any
ML>computer character codes having more than six bits (and IBM wasn't
ML>making public noises about System/360 at the time).
General Electric and UNIVAC had 36-bit-word machines (with 9-bit bytes) in
their labs around the same time IBM was developing the S/360. These later
appeared as the GE-600 series (of MULTICS fame) and the UNIVAC 1100 series.
Both of these machines could switch between 6-bit BCD, 7-bit ASCII and 8-bit
EBCDIC with ease.
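(For anyone who never touched a 36-bit box, the packing arithmetic is simple
enough: six 6-bit BCD characters per word, or four 9-bit characters per word.
A little C sketch of my own, using a 64-bit integer to stand in for the
36-bit word:)

    /* Illustration only: a uint64_t stands in for a 36-bit word;
       only the low 36 bits are used. */
    #include <stdio.h>
    #include <stdint.h>

    static uint64_t pack6(const unsigned char c[6])   /* 6 x 6 bits = 36 */
    {
        uint64_t w = 0;
        for (int i = 0; i < 6; i++)
            w = (w << 6) | (c[i] & 0x3F);
        return w;
    }

    static uint64_t pack9(const unsigned char c[4])   /* 4 x 9 bits = 36 */
    {
        uint64_t w = 0;
        for (int i = 0; i < 4; i++)
            w = (w << 9) | (uint64_t)c[i];
        return w;
    }

    int main(void)
    {
        unsigned char bcd[6]   = { 1, 2, 3, 4, 5, 6 };
        unsigned char ascii[4] = { 'W', 'O', 'R', 'D' };
        printf("BCD word:   %09llx\n", (unsigned long long)pack6(bcd));
        printf("ASCII word: %09llx\n", (unsigned long long)pack9(ascii));
        return 0;
    }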
ML>Later, Ctrl-Z was a
ML>convention adopted by the computer manufacturers as an EOF mark for the
ML>new 8-bit tape-drive systems.
I think you will find it was the end-of-transmission character in the TTY
world, and that is why we still have it as end-of-file.
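(You can still watch the convention at work in any DOS-descended C runtime:
a text-mode read stops at the first 1ah byte, while a Unix runtime reads
straight through it. A quick sketch of my own, assuming nothing beyond a
standard C library; the file name is just an example:)

    /* Write a 1Ah byte into the middle of a file, then read it back
       in text mode and see where the read stops. */
    #include <stdio.h>

    int main(void)
    {
        FILE *f = fopen("ctrlz.txt", "wb");
        if (!f) return 1;
        fputs("before\x1a" "after", f);   /* Ctrl-Z buried in the data */
        fclose(f);

        char buf[64];
        f = fopen("ctrlz.txt", "r");      /* text mode */
        if (!f) return 1;
        size_t n = fread(buf, 1, sizeof buf - 1, f);
        buf[n] = '\0';
        fclose(f);

        printf("text-mode read returned %u bytes: \"%s\"\n",
               (unsigned)n, buf);
        return 0;
    }

On DOS or OS/2 the read should come back six bytes short; on a Unix box you
should get the whole twelve.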
ML> ASCII became a computer problem in 1964 with IBM's announcement of
ML>the 8-bit System/360 and its associated 8-bit (plus parity) tape drives.
ML>The ANSI committee met to decide how to "embed" seven-bit ASCII in the
ML>new 8-bit medium. Officially, ASCII is still a 7-bit code: the eighth
ML>(high-order) bit in "true" ASCII embodiments is always zero! The eighth
ML>bit was put to use (for desktop machines) in the IBM PC (1981) with the
ML>IBM-defined "extended ASCII" that made the top 128 characters available
ML>for keyboard input, but has never been adopted as an official standard.
Yes, ASCII was junk then and is still junk now, compared to EBCDIC.
What I could never fathom was why IBM allowed the PC to be developed using
ASCII, when all the other platforms IBM manufactured in the late 1970s used
EBCDIC. The S/370, S/3x, S/7 and S/1 were all EBCDIC boxes, as were all of
IBM's network front-end processors. All the control sequences for BSC and
SDLC communications were EBCDIC. In that context, ASCII made no sense at
all!
It still doesn't.
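(For what it is worth, Murray's point that the eighth bit is always zero in
"true" ASCII is easy to check in code, and the old 7-bit mail paths used to
force it to zero in exactly this way. My own sketch, not anything IBM ever
shipped; the test string is made up:)

    /* Test whether a buffer is pure 7-bit ASCII, then strip the
       high-order bit the way a 7-bit gateway would. */
    #include <stdio.h>
    #include <stddef.h>

    static int is_7bit(const unsigned char *p, size_t n)
    {
        for (size_t i = 0; i < n; i++)
            if (p[i] & 0x80)        /* high-order bit set: not ASCII */
                return 0;
        return 1;
    }

    int main(void)
    {
        unsigned char line[] = "plain text\x9a with one extended byte";
        size_t n = sizeof line - 1;

        printf("pure ASCII? %s\n", is_7bit(line, n) ? "yes" : "no");

        for (size_t i = 0; i < n; i++)  /* force the eighth bit to zero */
            line[i] &= 0x7F;

        printf("pure ASCII? %s\n", is_7bit(line, n) ? "yes" : "no");
        return 0;
    }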
Regards
Dave
___
* MR/2 2.25 #353 * It works fine except when I am in Windows.
--- Maximus/2 3.01
* Origin: Air Applewood, OS/2 Gateway to Essex 44-1279-792300 (2:257/609)