
A problem with nested Gosub's/ExecStr's

Discussion in 'Support' started by mathewsdw, Nov 29, 2011.

  1. mathewsdw

    I may have mentioned before that the internal hard disk partitions on this laptop are becoming full enough (currently 98.3% in use on the C: partition, 96.7% on the D: partition) that this is starting to become a real problem. So, to figure out which files I could safely archive to my external hard drive or get rid of entirely, I attempted (!) to write a .btm file that went through all of the directories on a given hard drive and reported the total file size and total allocated file size for each directory, plus the same figures for each directory together with its subdirectories. This batch file started at the "leaves" of each given directory and then worked its way "upward". (You'll understand why this is the best approach if you think about it.)

    Well, I got said batch file written and tested for my Z: drive (a RAM disk), where the directory structure does not go that "deep". However, when I attempted to run it on my C: drive, it bombed. So I wrote a batch file to test specifically that. Here are the complete contents of said batch file:
    Code:
    @Echo Off
    On Error Goto ShowLevel
    SetLocal
    Gosub NestedGosub 1
    EndLocal
    Quit 0
    :ShowLevel
       @Echo  Final Level: %Level
       EndLocal
       Quit 8
    :NestedGosub [Level]
       If %Level GT 200 ^
          Return
       @Echo Nested Level: %Level
       Gosub NestedGosub %@Inc[%Level]
       Return
    
    and here are the complete results of running said batch file:

    Code:
       Tue  Nov 29, 2011   3:51:46p
    
    ISO8601 plugin v1.1.1 loaded.
    SafeChars plugin v1.5.7 loaded.
    
    TCC  12.11.76   Windows 7 [Version 6.1.7601]
    Copyright 2011  Rex Conn & JP Software Inc.  All Rights Reserved
    Registered to Daniel Mathews
    
    [Z:\]NestedGosub
    Nested Level: 1
    Nested Level: 2
    Nested Level: 3
    Nested Level: 4
    Nested Level: 5
    Nested Level: 6
    Nested Level: 7
    Nested Level: 8
    Nested Level: 9
    Nested Level: 10
    Nested Level: 11
    Nested Level: 12
    Nested Level: 13
    Nested Level: 14
    Nested Level: 15
    Nested Level: 16
    Nested Level: 17
    Nested Level: 18
    Nested Level: 19
    Nested Level: 20
    Nested Level: 21
    Nested Level: 22
    Nested Level: 23
     Final Level:
    
    [Z:\]
    
    Well, it is documented that there is a maximum nesting level for GOSUBs, but it's also clear that the directory structure on my C: drive goes "deeper" than that limit allows.
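    As an aside, the leaves-first traversal doesn't actually require recursion at all, so no nesting limit need ever come into play. Here is a minimal sketch of the idea - in Python rather than TCC, purely for illustration, and with error handling reduced to skipping unreadable files:

    ```python
    import os

    def dir_totals(root):
        """Walk the tree bottom-up (leaves first) and return a dict mapping
        each directory to its cumulative size, including all subdirectories.
        Iterative, so arbitrarily deep trees pose no nesting problem."""
        totals = {}
        for dirpath, dirnames, filenames in os.walk(root, topdown=False):
            size = 0
            for name in filenames:
                try:
                    size += os.path.getsize(os.path.join(dirpath, name))
                except OSError:
                    pass  # file vanished or is inaccessible; skip it
            # Bottom-up order guarantees each subdirectory was already totaled.
            size += sum(totals.get(os.path.join(dirpath, d), 0)
                        for d in dirnames)
            totals[dirpath] = size
        return totals
    ```

    The bottom-up order (`topdown=False`) is what makes each directory's total computable in a single pass: every subdirectory's cumulative total is already available when its parent is visited.
    
    
    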

    So, second try using "@ExecStr" instead of "Gosub":

    Code:
    @Echo Off
    If Not Exist %2 ^
       Type NUL: >%2
    Echo Level: %1   Output File: %2 >>!%2
    If %1 GT 5 ^
       Quit 0
    On Error Goto ReachedLimit
    Echo  @ExecStr[Z:\ExecStrWithinExecStr %@Inc[%1] %2] >>!%2
    Echo @ExecStr: %@ExecStr[Z:\ExecStrWithinExecStr %@Inc[%1] %2] >>!%2
    Echo Level: %1   Output File: %2
    Quit 0
    :ReachedLimit
       @Echo Reached the limit of %1 >CON:
       Quit 8
    
    And here are the (somewhat surprising to me!) results (in ExecStrWithinExecStr.txt) of running the above batch file:

    Code:
    Level: 1   Output File: ExecStrWithinExecStr.txt
     @ExecStr[Z:\ExecStrWithinExecStr 2 ExecStrWithinExecStr.txt]
    Level: 2   Output File: ExecStrWithinExecStr.txt
     @ExecStr[Z:\ExecStrWithinExecStr 3 ExecStrWithinExecStr.txt]
    Level: 3   Output File: ExecStrWithinExecStr.txt
     @ExecStr[Z:\ExecStrWithinExecStr 4 ExecStrWithinExecStr.txt]
    Level: 4   Output File: ExecStrWithinExecStr.txt
     @ExecStr[Z:\ExecStrWithinExecStr 5 ExecStrWithinExecStr.txt]
    Level: 5   Output File: ExecStrWithinExecStr.txt
     @ExecStr[Z:\ExecStrWithinExecStr 6 ExecStrWithinExecStr.txt]
    Level: 6   Output File: ExecStrWithinExecStr.txt
    @ExecStr:
    @ExecStr: Level: 5   Output File: ExecStrWithinExecStr.txt
    @ExecStr: Level: 4   Output File: ExecStrWithinExecStr.txt
    @ExecStr: Level: 3   Output File: ExecStrWithinExecStr.txt
    @ExecStr: Level: 2   Output File: ExecStrWithinExecStr.txt
    
    Notice that at level 6 the "@ExecStr" function evidently failed, given that it produced no output of any kind.

    And, just to be complete, here is the complete TCC session:
    Code:
       Tue  Nov 29, 2011   5:03:30p
    
    ISO8601 plugin v1.1.1 loaded.
    SafeChars plugin v1.5.7 loaded.
    
    TCC  12.11.76   Windows 7 [Version 6.1.7601]
    Copyright 2011  Rex Conn & JP Software Inc.  All Rights Reserved
    Registered to Daniel Mathews
    
    [Z:\]ExecStrWithinExecStr 1 ExecStrWithinExecStr.txt
    Level: 1   Output File: ExecStrWithinExecStr.txt
    
    [Z:\]
    
    Since this is a high-priority task, I will next attempt to do it using the "Start" command, and I don't see any reason in advance why that won't work (although I don't know, off the top of my head, what the proper parameters to the "Start" command will be, so I will have to do some research and experimentation). But I tend to think that, at minimum, the GOSUB nesting level should be increased.
     
  2. Charles Dye

    Charles Dye Super Moderator
    Staff Member

    It's not very relevant to your question, but have you noticed the TREE command's /Z option?
     
  3. Steve Fabian

    Did you try *DIR/S/U ?
    The reporting format is not what you want (too verbose), but IMHO it provides all the information you want...
    --
    Steve
     
  4. mathewsdw

    Yes, Charles, I had, but not in quite a while. My (admittedly not very reliable) recollection was that it gave me only part of what I wanted, and looking at it again just now per your suggestion, I was right: it only gives me a (somewhat large, I will admit) part of what I want. And, while I wouldn't exactly call my own solution "convenient" (particularly since I have to use temporary files to "communicate" between the various "started" sessions), it works and, of course, gives me exactly what I want, because I wrote it. (I've been accused in the past of going well beyond the "call of duty", so to speak, and that's probably right. However, I want what I want.)
     
  5. mathewsdw

    Steve, as they say, close but no cigar. I was dimly aware of it, but didn't think that it was all I needed, and I was right. As I told Charles, while I wouldn't exactly call my own solution "convenient" (particularly since I have to use temporary files to "communicate" between the various "started" sessions - though that's really not a noticeable performance issue at all, because all of these temporary files are stored on a RAM disk), it works and, of course, gives me exactly what I want, because I wrote it. (Again, I've been accused in the past of going well beyond the "call of duty", so to speak, and that's probably right. However, I want what I want.)

    And the simple truth, IMHO, is that the nesting limits on both GOSUBs and @ExecStrs are really not technically needed in today's world, where the typical user has multiple gigabytes of RAM. I don't have a particularly high-end laptop (in fact, as far as I can tell, it is only capable of 32-bit processing), but even it has almost 4 GB of RAM. At this moment I have 24 "apps" running, and that number does not even include the multiple tabs in the multiple Take Command and Firefox sessions. That number is fairly typical, if not on the low end (I tend to keep things "open" until I am absolutely, without a doubt, "done" with them, because of my memory issues), and I've never seen even the slightest indication that the machine has to page to and from disk in anything close to "normal" situations - it has happened, but very rarely.
     
  6. Charles Dye

    Charles Dye Super Moderator
    Staff Member

    There is also the @FILESIZE function, which accepts both wildcards and a /S option to recurse. Calling this repeatedly is, admittedly, not as elegant as starting from the innermost subdirectories and working your way out; but it's probably a lot easier to write!
     
  7. samintz

    samintz Scott Mintz

    Daniel,

    Are you aware that DIR already reports that information? Look up the help
    for the /U switch.

    DIR /S /U
    DIR /S /U1
    DIR /S /U2

    -Scott




     
  8. samintz

    samintz Scott Mintz

    Dan,

    Earlier versions of 4NT and TCC did not have a nesting limit and when
    memory was exhausted they would crash and burn. The current
    implementation allows for a graceful recovery. Even though you may have
    4GB of RAM in your PC, Windows is only using 3GB of it if you are using a
    32 bit OS. If you really want to see a clear picture of memory usage run
    ProcessExplorer from SysInternals.

    What data is *not* reported by the DIR /U, /U1, and /U2 switches that you
    require?

    -Scott




    Steve, as they say it comes close but it's no cigar.

    And the simple truth IMHO is that the nesting limits of both Gosub's and
    ExecStr's are really not technically needed in today's world where the
    typical user has multiple Gigabytes of RAM. I don't have a particularly
    high-end laptop (in fact, it is, as far as I can tell, only capable of
    32-bit processing), but even it has almost 4G of RAM. (And the simple
    truth is at this moment I have 24 "apps" running, and that number does not
    even include the multiple tabs in either the multiple Take Command or
    FireFox sessions. And that number is fairly typical if not on the low end
    (I tend to keep things "open" until I am absolutely, without a doubt,
    "done" with them because of my memory issues), and I've never seen even
    the slightest indication that the machine has to page to and from disk in
    anything close to "normal: situations - it has happened, but very rarely.
     
  9. mathewsdw

    Scott, first, I will note again that this is no longer a problem, because I have solved it (again, using the "Start" command to completely circumvent those limits).

    As far as the RAM goes, Process Explorer reports that I have 2.75 GB of physical RAM (possibly, as you say, as much as 32-bit Windows can see), of which 0.57 GB is free. Not much, I'll admit, but it doesn't seem to be a problem because, again, I have 24 apps running at this moment, the largest memory user being Firefox at 82.5 MB, or 0.08 GB. Somewhat surprisingly to me, although Process Explorer says I have a "page-fault delta" consistently around 6K (it goes as low as just below 3K and, rarely, as high as something just over 12K - rather strangely, because the only apps I am running that are using any significant CPU time are a TCC session that is doing a "significant" related task, plus the screen-magnifier app that I need because of my vision issues), the "page read" delta is mostly 0 and sometimes goes as high as 3. Does that mean "page faults" without any corresponding paging "in"? The disk activity light is only flashing once in a while - is the "paging" to and from the physical memory that 32-bit Windows can't "see"? And why any paging at all (if the page-fault delta shown by Process Explorer is what I at least thought it was) when I have half a gigabyte (which seems to me to be quite a bit) free?

    As to "what data is not reported": the data I am gathering is split into several "categories" (things such as .exe's/.dll's being the "primary" categories), and while gathering this data would certainly be possible using multiple "Dir *.xxx /S /U" commands run one after the other and combining the data, the output of the "Dir /U" command would also have to be substantially "parsed", because it is not at all in the format/"order" that I want.

    And I am aware of, and have the latest versions of, all of the Sysinternals utilities. As an aside related to both Process Explorer and the "Handle" program: neither of them works at all reliably under Windows 7 (and this has been verified by at least one other user), which is very unfortunate for me because I relied on them heavily, in particular because of my bad-memory issues. And, as the licensing FAQ on the site says: "Q: Is there technical support available for the Sysinternals tools? A: No. All Sysinternals tools are offered 'as is' with no official Microsoft support." This came from a substantial "discussion" I had on the web (http://forum.sysinternals.com/miscellaneous-utilities_forum11.html - topic "Problem with the "Handle" command under Windows 7"), and if you read that posting in any detail you will see why I have a real need for that functionality; so it looks like I am out of luck for the indefinite future.

    So, the bottom line, IMHO, is that even if this was a problem in the past, it should not be one (at least very often) now, because ample memory is available; and, in any case, the "graceful" recovery IMHO should be quite a bit more "graceful".

    - Dan
     
  10. mathewsdw

    Charles, not only would I call it not elegant, I would call it absolutely horrible from a performance standpoint. Yes, this could be put in a "For" loop with the "/R" option, but counting the contents of the directories at and near the bottom of the tree many times over would be pretty bad, if not entirely impractical, from a time standpoint. That was why I made the statement "You'll understand why this is the best approach if you think about it." in my original posting. (In fact, that was my "tentative" first try, and I Ctrl-C'd it after about 10 minutes of waiting on my C: drive, whereas the current version (using "Start" commands) only takes about 5 minutes for the same drive.) Not a bad idea in principle, but IMHO essentially impractical in practice.

    - Dan
     
  11. rconn

    rconn Administrator
    Staff Member

    The limit in GOSUB is to prevent users from writing infinite loops (which
    they do with great enthusiasm). There isn't a direct limit on @EXECSTR, but
    you're probably running into the 20-loop alias limit. (Or running out of
    memory -- you're going to have less than 1Gb of memory actually available to
    you if you're running 32-bit Windows, regardless of the amount in your
    system.)

    What you asked for in your original message can be easily accomplished with
    TREE /Z or one of the DIR /Un options. Since you don't want to use either
    of those, you presumably have an additional goal that you haven't mentioned
    yet. If you can explain why TREE /Z and DIR /Un don't meet your needs, we
    can probably suggest an alternate solution.
     
  12. mathewsdw

    Scott, please see my substantial response to this issue that I made to your posting on this thread at 12:47. Thank you.

    - Dan
     
  13. mathewsdw

    Rex,

    First off, I "solved" the problem (somewhat inconveniently) by using the "Start" command.

    Secondly, at this moment I have 1,760 MB of physical memory in use and 995 MB available, for a total of 2,759 MB, or 2.68 GB. That's not a guess; these are the numbers reported by the Windows 7 "Memory" tab of the Task Manager's "Resource Monitor" (a button on the Task Manager "Performance" tab). I don't have to tell you that that is 2.68 times the 1-gigabyte limit you mention above.

    And, as I mentioned in previous posts, "Dir /Un" and "Tree /Z" didn't meet my (desired) needs "out of the box", so to speak. And, again, figuring out how to make them meet those needs is not, strictly speaking, necessary, given that I have solved the problem (in exactly the way I wanted) by using the "Start" command. However, I would be interested in "alternate solutions", strictly out of curiosity and as an opportunity to learn something that I don't already know (although, given my severe memory issues, that might not really be worthwhile...)
     
  14. rconn

    rconn Administrator
    Staff Member

    That's not relevant -- I'm referring to the memory actually available for allocation (via VirtualAlloc) within the TCC process. (Which can theoretically never exceed 2 GB minus the code size in 32-bit Windows, and in practice is more like 1 GB or less.)

    Since you still haven't said *why* they don't meet your needs, it's not possible to provide any alternate solutions.
     
  15. mathewsdw

    Rex, that is true, and I didn't say "why" because it was, as far as I was concerned, a "solved" problem. However, what I wanted (and ultimately got) was a report, sorted in "reverse" order on any one of several different fields, containing, for every directory on a drive (or every subdirectory of a given directory on a drive), a line showing for that directory:

    - the number of "executable" files, the number of non-executable files, and the total number of files;
    - the total size of the "executable" files, the total size of the non-executable files, and the total size of all of the files;
    - (most importantly) the total allocated size of the "executable" files, the total allocated size of the non-executable files, and the total allocated size of all of the files;
    - and, finally, all of the above for both that directory and all of its subdirectories combined.

    And this is what I ultimately was able to achieve, although, again, I had to repeatedly start new instances of the batch file in "started" TCC sessions (invisible windows, closed when finished) that "passed" their results back to the batch files that "started" them, via a temporary file allocated by the "starting" batch file.
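    For the record, the per-directory bookkeeping described above can be sketched in a few lines. This is a hypothetical Python illustration (not the actual .btm file); the "executable" extension set and the 4 KiB cluster size used to approximate allocated size are assumptions made purely for the example:

    ```python
    import os

    # Extensions treated as "executable" -- an assumption for this sketch.
    EXEC_EXTS = {".exe", ".dll", ".sys", ".com"}
    CLUSTER = 4096  # assumed cluster size; real NTFS allocated size may differ

    def report_line(dirpath, filenames):
        """Per-directory stats: (exe count, other count, total count,
        exe bytes, other bytes, total bytes, allocated bytes)."""
        exe_n = oth_n = exe_sz = oth_sz = alloc = 0
        for name in filenames:
            size = os.path.getsize(os.path.join(dirpath, name))
            # Round up to a whole number of clusters for the "allocated" size.
            alloc += -(-size // CLUSTER) * CLUSTER
            if os.path.splitext(name)[1].lower() in EXEC_EXTS:
                exe_n += 1
                exe_sz += size
            else:
                oth_n += 1
                oth_sz += size
        return (exe_n, oth_n, exe_n + oth_n,
                exe_sz, oth_sz, exe_sz + oth_sz, alloc)
    ```

    Accumulating these tuples leaves-first (adding each directory's tuple into its parent's) then yields the combined directory-plus-subdirectories figures in a single pass.
    
    
    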

    And, yet again, that is what I was able to achieve.

    - Dan
     
