From: d83@ath.forthnet.gr (Don Schullian)
Subject: Re: Need help with binary file question...
> I am new to binary files. I am trying to read a file which is over 15 meg in
> size. Each record is 404 bytes long. When I read 82 records into the file, I
> get an error #15 (string length too long) using this command "GET$ 1, Position, A$"
> When A$ exceeds the 32750 limit, then the error pops up. Is there some way to
> reset A$ back to 0 and read another 32750 bytes into file from the point where
> the first seek ended.
Sounds like you're mixing apples and oranges, which is fine as long as you've
got a good reason. If you have a file with a known record length, I'd stick to
RANdom file access; it's much easier to handle. If, however, you're attempting
to, say, copy the file, then the record length makes no difference, so don't
worry about it....
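For the record-by-record route, a RANDOM open with your 404-byte records might look something like this (a sketch only; "MyRec" and its field are placeholders, you'd declare your actual fields so they total 404 bytes):

TYPE MyRec
  Filler AS STRING * 404       ' stand-in: replace with your real fields
END TYPE
DIM Rec AS MyRec

OPEN FileName$ FOR RANDOM AS #1 LEN = 404
NumRecs& = LOF(1) \ 404        ' how many 404-byte records the file holds
FOR i& = 1 TO NumRecs&
  GET #1, i&, Rec              ' read record number i& straight into Rec
  ' ... work with Rec here
NEXT
CLOSE #1

No string-length limit to worry about that way, since each read is exactly one record.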
OPEN "B", #1, FileName$
BytesLeft& = LOF(1)                 ' total bytes in the file
WHILE BytesLeft& > 0
  Chunk% = MIN(32000, BytesLeft&)   ' never ask for more than 32000 bytes
  DECR BytesLeft&, Chunk%
  GET$ #1, Chunk%, A$               ' read the next chunk into A$
  ' ... do what you want with A$ here
WEND
CLOSE #1
This will get a fresh 32000 bytes and stuff them into A$ until you hit the end
of the file, as long as you do not access the file from anywhere else for any
other purpose. If you must do that, then you'll have to keep track of where in
the file you got the last chunk from.
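Saving and restoring the position is only a couple of lines (sketch; "NextPos&" is just my name for the variable):

NextPos& = SEEK(1)        ' remember where the next GET$ would start
' ... other code may move the file pointer here ...
SEEK #1, NextPos&         ' put it back before the next GET$

SEEK as a function returns the current position; SEEK as a statement sets it.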
____ _ ____ ____ _____
| _ \ / \ / ___) __ | ___)(_ _)
| |_) / _ \ \____\/ \| _) | |
|____//_/ \_\(____/\__/|_| |_|
Reply to: d83@ath.forthnet.gr
www.basicguru.com/schullian
*** QwkNews (tm) v2.1
* [TN71] Toast House Import
--- GEcho 1.20/Pro
---------------
* Origin: Toast House Remote (1:100/561)