Steve,
While this isn't quite as convenient as it would be if @FileSeekL did what you want it to do (which it clearly does not), it isn't really hard. I've been doing it this way for some number of years, and I don't see any significant reason why you couldn't do the same. Here is a complete, fully tested, working example:
Code:
@Echo Off
SetLocal
On ErrorMsg Goto Fini
On Break Goto Fini
Do X In /P Timer On (Set StartTime=%X)
Set FileName="D:\A Reasonably Large File.data"
Do X In /P Timer On (Set CountTime=%X)
Rem @Lines returns the zero-based number of the last line, so @Inc yields the line count
Set Size=%@Inc[%@Lines[%FileName]]
UnSetArray /Q Data
SetArray Data[%Size]
Set Handle=%@FileOpen[%FileName, r]
Set IDX=0
Set Line=%@FileRead[%Handle]
Rem @FileRead returns **EOF** at end of file
Do While "%Line" != "**EOF**"
   Set Data[%IDX]=%Line
   Set /A IDX+=1
   Set Line=%@FileRead[%Handle]
EndDo
Rem The Echo >NUL: idiom discards @FileClose's return value
Echo >NUL: %@FileClose[%Handle]
Do X In /P Timer /S (Set ReadTime=%X)
Rem The Do loop increments IDX by itself; an extra Set /A IDX+=1 here would skip every other record
Do IDX = 0 To %@Dec[%Size] By 1
   @Echo %IDX: %Data[%IDX]
EndDo
:Fini
On ErrorMsg
Do X In /P Timer Off (Set EndTime=%X)
@Echo Size: %Size
@Echo Start Time: %StartTime
@Echo Count Time: %CountTime
@Echo Read Time: %ReadTime
@Echo End Time: %EndTime
UnSetArray /Q Data
EndLocal
Quit 8
I decided to leave all of the code in, including the "Timers" and "status" messages, since I don't think they make the code all that much harder to understand (at least for somebody like you! ;)), and they allowed me to measure performance. The "important" code is just what lies between the "Set Size=" and "@FileClose" lines, shown stripped down below.
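For reference, here is that core pattern with the instrumentation removed (a minimal sketch; the file name here is just a placeholder):
Code:
Set FileName="D:\MyFile.data"
Set Size=%@Inc[%@Lines[%FileName]]
SetArray Data[%Size]
Set Handle=%@FileOpen[%FileName, r]
Set IDX=0
Set Line=%@FileRead[%Handle]
Do While "%Line" != "**EOF**"
   Set Data[%IDX]=%Line
   Set /A IDX+=1
   Set Line=%@FileRead[%Handle]
EndDo
Echo >NUL: %@FileClose[%Handle]
And in terms of "real world" performance: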
"Real world" times when writing the data out to a file with "Echo":
Code:
Size: 3506
Start Time: Timer 1 on: 8:49:53
Count Time: Timer 1 on: 8:49:53
Read Time: Timer 1 Elapsed: 0:00:32.44
End Time: Timer 1 off: 8:50:31 Elapsed: 0:00:38.25
If you work it out, that comes to 0.0037 seconds to read each record, which ain't bad by my estimation, and, for what it's worth, 0.00166 seconds to write each record out to the file.
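As a quick sanity check on that last figure, the arithmetic can be done right at the prompt with @EVAL (the write phase being the total elapsed time minus the read split, spread over the 3506 records; I believe the "=x.y" suffix sets the display precision, but treat that detail as approximate):
Code:
Rem (38.25 - 32.44) seconds of write time divided by 3506 records
Echo %@Eval[(38.25-32.44)/3506=1.5]
which comes out to the 0.00166 seconds per record quoted above.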
And, for writing it directly to the console:
Code:
Size: 3506
Start Time: Timer 1 on: 8:49:53
Count Time: Timer 1 on: 8:49:53
Read Time: Timer 1 Elapsed: 0:00:32.44
End Time: Timer 1 off: 8:50:31 Elapsed: 0:00:38.25
Which is 0.012 seconds per record; also not bad, although it does show that processing times can vary quite a bit depending on what else is going on in the system. And, for reference, it took 12.62 seconds to write the data to the console, or 0.0036 seconds per record; given the variation in the file-read times, the difference between writing to the console and writing to a file can't be determined in any reliable way. (I will note that when I ran the second test, writing the data out to the console, the file was presumably already in the disk cache, which makes it rather surprising that the run took so much longer the second time around. Bottom line: elapsed times are not all that meaningful except over a fairly large number of "samples". Also, my machine is hardly what one would call fast, particularly the disk drives.)
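If you do want numbers you can trust, the simplest approach is to repeat the whole test some number of times and divide. A minimal sketch (here "ReadTest.btm" is a hypothetical batch file containing just the read loop above):
Code:
@Echo Off
Rem Repeating the test smooths out disk-cache and background-load effects
Set Runs=10
Timer On
Do I = 1 To %Runs
   Rem ReadTest.btm is a placeholder for the read loop shown earlier
   Call ReadTest.btm >NUL:
EndDo
Timer Off
Rem Divide the reported elapsed time by %Runs for the per-run average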
And this code is simple enough, even with my semi-blindness and bad memory, that I have no "issues" with it whatsoever.
I will also note that reading the file one extra time (for "%@Lines") was not at all a significant performance issue, so trying to guess what the maximum size of the file might be in order to set the size of the array is probably not a worthwhile thing to do.
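(If you want to verify that on your own data, timing just the counting pass is easy enough:)
Code:
Rem Measures only the @Lines pass; uses the same %FileName as above
Timer On
Set Size=%@Inc[%@Lines[%FileName]]
Timer Off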
- Dan