TM> Don't do this! An input stream isn't supposed to throw an exception on
TM> end of file. There is no good reason for it, since eof isn't an
TM> exceptional situation.
I've got a dozen lines of redundant code that say otherwise. Try
writing a reasonable tokenizer without it and you'll see why I need
to do this.
Let's pretend I have a simple language where everything is delimited
by white space.
Given situation: get a character and build a token until white space
is reached. White space can be an actual space, tab, or newline
character.
No token can be over MAX_TOKEN_LEN in length.
If you don't check for end of file, you will run into an infinite loop.
// assume inf is an open ifstream, token is a char array, and
// current starts at 0; strrchr() needs <string.h>
char c = '\0';
do {
    inf.get(c);
    if (inf.gcount() != 0) {              // did we actually read a character?
        if (current < MAX_TOKEN_LEN) {
            token[current++] = c;
        }
    }
} while ((strrchr(" \t\n", c) == 0) && (inf.gcount() != 0) &&
         (current < MAX_TOKEN_LEN));
if (inf.gcount() != 0) {                  // push the delimiter back for next time
    inf.putback(c);
}
As you can see, this is a nasty piece of so-called structured code.
If MyIfstream threw an exception, I could eliminate almost half
of the redundant checks.
Notice that there will be _only_ one end of file reached per file.
If I check every character read from the file three times, I have a
slow, slow, slow piece of code.
I need one piece of code, run once, as one exception handler: not
several lines of code checking every character, every time.
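To make the comparison concrete, here is roughly what the same loop
could look like if MyIfstream threw on end of file. This is only a
sketch: the eof_error exception type, and MyIfstream itself, are
assumed here, not something that already exists.

// Sketch only: assumes MyIfstream::get() throws eof_error at end of file.
char c = '\0';
try {
    do {
        inf.get(c);                       // throws eof_error at end of file
        if (current < MAX_TOKEN_LEN)
            token[current++] = c;
    } while ((strrchr(" \t\n", c) == 0) && (current < MAX_TOKEN_LEN));
    inf.putback(c);                       // give the delimiter back
}
catch (eof_error&) {
    // end of file handled exactly once, right here
}

One try/catch around the whole tokenizer, instead of a gcount() test
after every single get().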
Note that I agree with you that the original ifstream should not be
made to behave this way; however, I need such a specialized class
in this case.
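For what it's worth, here is one shape such a specialized class might
take. Again, only a sketch: eof_error is a made-up exception type, and
deriving from ifstream is just one possible approach.

// Sketch of the specialized class (needs <fstream.h> or <fstream>):
class eof_error {};                       // made-up exception type

class MyIfstream : public ifstream {
public:
    MyIfstream(const char* name) : ifstream(name) {}

    MyIfstream& get(char& c) {
        ifstream::get(c);
        if (gcount() == 0)                // nothing read means end of file
            throw eof_error();
        return *this;
    }
};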
--- GEcho 1.00
---------------
* Origin: Digital OnLine Magazine! - (409)838-8237 (1:3811/350)