RJT> I used to watch as Eudora Light pulled in email and, when it got
RJT> something bigger than about 64k or so, it would split it up. Why?
RJT> Beats the heck out of me...
LE> Easy, dealing with a buffer bigger than 64k is a royal pain
LE> unless you are in protected mode.
I can see where it'd affect your choice of memory model (although you could
always declare a far pointer to your buffer, or whatever), but I can't see
where protected mode comes into it. I don't believe that software kept more
than one message in memory at any given time. Besides, windoze was already
running in protected mode...
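Just to show what I mean about the far-pointer business, here's a rough
sketch from memory of touching a >64k buffer in 16-bit real-mode C. The
farmalloc/farfree/huge stuff is Borland-specific (other DOS compilers spell
it differently), so take the details with a grain of salt:

  /* Real mode: a plain far pointer wraps at the 64k offset boundary,
   * so you need a "huge" pointer, which renormalizes the segment on
   * every bit of pointer arithmetic (and pays for it in speed). */
  #include <alloc.h>
  #include <stdio.h>

  int main(void)
  {
      unsigned long size = 100000UL;      /* bigger than one segment */
      char huge *buf = (char huge *) farmalloc(size);
      unsigned long i;

      if (buf == NULL) {
          printf("not enough far heap\n");
          return 1;
      }
      for (i = 0; i < size; i++)          /* walks across segments */
          buf[i] = 0;
      farfree(buf);
      return 0;
  }

In a flat 32-bit environment the same thing is just malloc() and a normal
pointer, which is why the 64k excuse never impressed me much.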
LE> I'd love to see how some of these authors would deal with
LE> situations I've had to work with in the past. For example, on
LE> an old 8-bit system where the programs only had 32k after the
LE> OS and BIOS, I had to sort records in a file that filled a 180k
LE> disk.
What OS are you talking about here?
LE> It was a 2-drive system, and one drive was needed for the OS
LE> disk. So no way to sort to another drive. I cheated, I used a
LE> built-in command for sorting an array in RAM, and just "stepped
LE> thru" the file. Sure, it took multiple passes, but it was a
LE> hell of a lot faster than trying to sort the records one at a
LE> time with the interpreted BASIC!
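Cute trick. If I'm picturing it right, the shape of it would be something
like this -- hypothetical C rather than your BASIC, and I'm inventing the
key type, chunk size, and I/O since you didn't give details (it also assumes
distinct keys, none equal to LONG_MIN; duplicates would need a tie-breaker
like the record number):

  #include <stdio.h>
  #include <stdlib.h>
  #include <limits.h>

  #define CHUNK 512                /* as many records as fit in RAM */

  static int cmp(const void *a, const void *b)
  {
      long x = *(const long *)a, y = *(const long *)b;
      return (x > y) - (x < y);
  }

  /* Each pass re-reads the whole file, keeps the CHUNK smallest keys
   * above the largest key already written, sorts that batch in RAM,
   * and appends it to the output.  The longs stand in for whole
   * records; the real thing would drag the record along with its key. */
  void multipass_sort(FILE *in, FILE *out)
  {
      long batch[CHUNK];
      long floor_key = LONG_MIN;   /* everything <= this is already out */

      for (;;) {
          int n = 0;
          long key;

          rewind(in);              /* one full pass per batch */
          while (fread(&key, sizeof key, 1, in) == 1) {
              if (key <= floor_key)
                  continue;        /* written on an earlier pass */
              if (n < CHUNK) {
                  batch[n++] = key;
              } else {
                  /* batch full: evict the current largest if this
                   * key is smaller, so we keep the CHUNK smallest */
                  int imax = 0, i;
                  for (i = 1; i < CHUNK; i++)
                      if (batch[i] > batch[imax]) imax = i;
                  if (key < batch[imax]) batch[imax] = key;
              }
          }
          if (n == 0)
              break;               /* nothing left above floor_key */

          qsort(batch, n, sizeof batch[0], cmp);
          fwrite(batch, sizeof batch[0], n, out);
          floor_key = batch[n - 1];
      }
  }

Slow as molasses on the I/O, but it never needs more than one batch in
memory at a time, which sounds like the same tradeoff you made.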
Heh. I think there are a lot of programming skills that are either never
used or are in the process of being lost, mostly because of the changes in
hardware. I'll bet that a lot of what's out there could be done, and done a
whole lot smaller, using a whole lot fewer system resources, if programmers
still worked the way they used to.
Hell, I ran a PDP-11 system one time that had a *maximum* of 56k (words, not
bytes!) of RAM in it, and would support eight users!