How to? Is there any way to get @Files to report...

the number of files in a directory, minus the dot and double-dot entries, when trying to determine the total number of directories (and possibly files) in a directory? To be more precise about exactly what I am trying to accomplish: I would like @Files (or a function that I write to "replace" it) to return the same number for the "root" of a "genuine", physical drive (or RAM disk) as it does for a drive letter which is Subst'd to a subdirectory of a drive, which means that those two pseudo-directories should not be counted. (The only file-name "patterns" that I am aware of that would count them are "*" and ".", assuming that the other parameter(s) are such that directories are being counted at all; but that doesn't help as far as I can tell, because any "pattern" that would exclude the dot and double-dot entries would exclude other directory entries as well.)

(I think the reasons why there is no "equivalent" for "@Files" of the "/H" parameter, which is available for both the "Dir" and "PDir" commands (but with precisely "opposite" meanings), have been "addressed" on this forum in the past; but as far as I can tell, that is still the case.) I am not aware of any file-name pattern that would count all of the directories in a directory except for the dot and double-dot entries, just as there are no "attributes" that I can think of that would exclude just the dot and double-dot entries, either. And in terms of Subst'd drives, I can't really assume that a Subst'd drive letter will be "connected to" a subdirectory of another drive rather than the root directory of another drive. While I wouldn't call this exactly "vital", I do consider it important for the application that I intend to use it for.
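To make the goal concrete, here is the logic I am after expressed in Python terms (a hypothetical sketch only, not TCC code; the name lists are made up to stand in for what a raw directory enumeration would report):

```python
# Hypothetical sketch of the behavior wanted from @FILES: count a
# directory's entries while ignoring the "." and ".." pseudo-entries.
# The raw name lists stand in for what a FindFirstFile-style enumeration
# reports: a non-root directory includes the dot entries, a root does not.
def count_without_dots(raw_names):
    return sum(1 for name in raw_names if name not in (".", ".."))

root_listing = ["boot", "Users", "Windows"]            # real drive root
subst_listing = [".", "..", "boot", "Users", "Windows"]  # Subst'd subdir
print(count_without_dots(root_listing))   # 3
print(count_without_dots(subst_listing))  # 3 - the same answer either way
```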

Just as a note on why I would ever Subst one drive letter to the root directory of another drive: that is now the completely "normal" situation for me. As I have often mentioned on this forum, I use, and am heavily dependent on in a whole number of ways, a RAM disk, which I have been using for several years; and when I installed the RAM-disk software I chose "Z:" as the drive letter because it would be both the easiest to remember and the most "out of the way", so to speak.

Well, I recently had a need to get and install the latest version of that software (I no longer remember exactly why; bad memory again), and the latest version did not let me assign a drive letter of my choice in the installation procedure, unlike the previous version. (In fact, it "came up" as drive "J:", the next available drive letter, and I could find no way to change that to anything else.) This was, simply put, totally unacceptable to me, both because I had years of habit that made me automatically type "Z:" in various situations, and because I had a fair number of batch files (and C++ programs) with Z: intentionally hard-coded in them, because I absolutely did not want them to do anything to a disk that wasn't a RAM disk. Simply put, they could be very destructive, and that "destruction" wouldn't really be that big of a deal (other than being somewhat inconvenient) if done to my RAM disk, whereas it could be catastrophic if done to a real, physical, hard disk.

So my "solution" to the above was simple: I have code in my Windows "Startup" procedure that assigns drive letter Z: to drive letter J:, both "administratively" and not administratively ("administrative" drive letters are not "seen" by non-administrative tasks, and vice versa), so the fact that drive "Z:" is really drive "J:" is almost completely "transparent" to me.
(I have a couple of (C++) programs that I've written over the years that "translate" subst'd drive letters to their "real" locations for various reasons (mostly having to do with "safety" and/or "paranoia"), and while I suppose I could "steal" that code for this purpose, I'd really rather not because I am trying to get out of the C++ "habit" entirely.)
 
Hi,

@files seems to lack a switch for "hide dots", indeed.
Try redirecting the "dir" output and counting the lines:

Code:
C:\Program Files\JPSoft\TCMDx64 >dir /a:d /h /b c:\ > clip:
 
C:\Program Files\JPSoft\TCMDx64 >type /l clip:
  1 : $Recycle.Bin
  2 : Apps
  3 : boot
  4 : Documents and Settings [C:\Users]
  5 : Intel
  6 : MSOCache
  7 : Oxxxx
  8 : PerfLogs
  9 : Program Files
  10 : Program Files (x86)
  11 : ProgramData
  12 : Programme
  13 : Programme (x86)
  14 : Recovery
  15 : System Volume Information
  16 : Temp
  17 : Users
  18 : usr
  19 : Windows
 
C:\Program Files\JPSoft\TCMDx64 >echo %@inc[ %@lines[clip:] ]
19
 
You brought up a closely related issue on 2011-03-06 in this (Support) forum: how to use @FILES[] to count directories in the (sub)tree rooted at %D, without including the . and .. directories. My suggestion was %@FILES[/s "%d",d]. My suggestion for the current question is
%@eval[ %@files[/s "%d",d] + %@files[/s "%d",-d] ]
in other words, sum the count of directories and the count of files.

Alternately, %@execstr[4,*dir/s/h/u2/a: %x] returns a string which includes the file and directory counts, and %@word[" ",XXX,count_string] extracts the file and directory counts, when XXX is 3 and 6, respectively. Both methods are much faster than Frank's suggestion (counting lines in DIR command output), but intuitively the first of the two (summing the @FILES-returned file and directory counts) is the faster. HTH.
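Expressed in Python terms, the summing idea looks like this (a non-recursive sketch, under the assumption that the per-category counts never include the dot entries, which is the property being relied on from @FILES):

```python
import os

# Count subdirectories and files separately, then sum them.
# os.scandir, like the @FILES category counts, never reports "." or "..",
# so the sum equals the total entry count with the dot entries excluded.
def count_entries(path):
    dirs = sum(1 for e in os.scandir(path) if e.is_dir(follow_symlinks=False))
    files = sum(1 for e in os.scandir(path) if not e.is_dir(follow_symlinks=False))
    return dirs + files
```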
 
Steve, I do not understand. "Echo %@Files[.,d]" shows 2 when "asking about" a Subst'd drive that is "anchored" to a subdirectory, not 0! Your second, more "complicated", suggestion works; but I do wonder about its "performance impact" compared to Frank's solution, which does not require @ExecStr. (Because this machine ain't exactly the fastest, I do worry about performance to some degree, but maybe unnecessarily.) - Dan
 
And Frank, thank you as always. Your solution works perfectly. (I was doing the same thing except to a temporary file that I then deleted after counting the lines in it but that was a real pain so that's why I was hoping there was a better solution, and of course there was.) Thank you again - got rid of some really ugly code!:)

Oh, while I was dimly aware of "CLIP:", I've never had a reason to use it meaning that I didn't think of it.

Frank, please see the last posting below because I can't guarantee that you will read this, and I feel that it might be kind of important if you've already read my previous updates to this entry.
 
Steve, I do not understand. "Echo %@Files[.,d]" shows 2 when "asking about" a Subst'd drive that is "anchored" to a subdirectory, not 0! Although your second, more "complicated" suggestion works - Dan
On my WinXP SP3 NTFS drive, I just created a SUBST to the root of my work drive, and another to its \JPSOFT directory. Whether my CWD was the root of the SUBST drive or the real drive, %@files[.,d] and %@files["*",d] always returned the same directory count. Likewise, %@files[/s "*",d] returned the same count whether the CWD was the explicit F:\JPSOFT, or Y:\ with truename of F:\JPSOFT. Maybe your test location has 2 hidden subdirectories? I could not find an incorrect report from %@eval[ %@files[/s "%d",d] + %@files[/s "%d",-d]] ...
 
Steve, that is not the case for what I am trying to "use" this for. And you will quickly see the (important!!!) difference between yours and mine if you look at the below:
Code:
  Tue  Jan 24, 2012  10:57:03a
 
ISO8601 plugin v1.1.1 loaded.
SafeChars plugin v1.6.1 loaded.
Sift v0.55.0 loaded.
 
TCC  12.11.76  Windows 7 [Version 6.1.7601]
Copyright 2011  Rex Conn & JP Software Inc.  All Rights Reserved
Registered to Daniel Mathews
 
[Z:\]MD NotHidden /D
[Z:\NotHidden]Subst Y: .
[Z:\NotHidden]Y:
[Y:\]Echo %@files[/s "*",d]
0
[Y:\]Echo %@files["*",d]
2
[Y:\]
The problem is that I absolutely do not want to count the contents of subdirectories; doing so would completely invalidate this for the purpose I am trying to use it for.

So, good idea in the general case but not in this specific case.

- Dan
 
Frank, it's taking more than 60 seconds (using "Timer On" and "Timer Off", so it's not an estimate) in one of the primary circumstances in which I intend to use this batch file. I thought it was the "/B" parameter (because I didn't remember it taking so long before I added it), and then I thought it could be a number of other things, which I tried one at a time; at this point I haven't been able to isolate exactly where that time is being "consumed", and I can't experiment with it any further right now because I had to leave about 10 minutes ago to get to an appointment. I will look into it further when I get back, but I really doubt at this point that it is something anybody can do anything about, because I'm rather sure that it is related, in some way, to what I consider to be required functionality. But thank you, and I'll report back once I've identified the cause, just in case you are curious.

- Dan
 
Code:
C:\Temp >logparser -i:fs -recurse:0 "select count(name) from *.* where substr(name,0,1) <> '.'"
COUNT(ALL Name)
---------------
18
 
Statistics:
-----------
Elements processed: 20
Elements output:    1
Execution time:    0.03 seconds
 
C:\Temp >logparser -q:on -i:fs -recurse:0 "select strcat('set count=',to_string(count(name))) from *.* where substr(name,0,1) <> '.'"
set count=18

Logparser is really fast even for huge directory-trees.

:rolleyes: don't take it too seriously - we're in the jpsoftware forum ;)
 
... Both methods are much faster than Frank's suggestion (counting lines in DIR command output) ...

Hello Steve,

I know this wasn't really sharp - just a quick-and-dirty way to help Dan for the moment.

best regards
Frank
 
Code:
C:\Temp >logparser -i:fs -recurse:0 "select count(name) from *.* where substr(name,0,1) <> '.'"
COUNT(ALL Name)
---------------
18
Execution time:    0.03 seconds
SWEET!
 
Well, Frank, the mystery has been totally solved. The following code:
Code:
  @EchoS Start of *Dir to Clip: to count files: `` >CON:
  Timer On >CON:
  *Dir %@Replace[/B,,%@Replace[/M,,%@Replace[/K,,%ParameterString]]] /B >!Clip: >&>NUL:
  Set Lines=%@Inc[%@Lines[Clip:]]
  @EchoS  End of *Dir to Clip: to count files: `` >CON:
  Timer Off >CON:
produces this:
Code:
Start of *Dir to Clip: to count files: Timer 1 on: 17:05:04
  End of *Dir to Clip: to count files: Timer 1 off: 17:05:27  Elapsed: 0:00:22.86
whereas this code:
Code:
  @EchoS Start of *Dir to temporary file on RAM disk to count files: `` >CON:
  Timer On >CON:
  Set CountFileName="Z:\Temp\%@BaseFileName[].%_PID.txt"
  *Dir %@Replace[/B,,%@Replace[/M,,%@Replace[/K,,%ParameterString]]] /B >!%CountFileName >&>NUL:
  Set Lines=%@Inc[%@Lines[%CountFileName]]
  Del %CountFileName >NUL:
  @EchoS  End of *Dir to temporary file on RAM disk to count files: `` >CON:
  Timer Off >CON:
and executing this:
Code:
DirPlus >TempFileQPWZ.txt
produces this:
Code:
Start of *Dir to temporary file on RAM disk to count files: Timer 1 on: 17:23:46
  End of *Dir to temporary file on RAM disk to count files: Timer 1 off: 17:23:46
Elapsed: 0:00:00.14
And entering this:
Code:
Echo %@Eval[22.86/.14]
produces this:
Code:
163.28571428571429
And finally, this:
Code:
[Z:\]fc ClipQPWZ.txt TempFileQPWZ.txt
produces this:
Code:
Comparing files ClipQPWZ.txt and TEMPFILEQPWZ.TXT
***** ClipQPWZ.txt
  End of *Dir to count files:  1/24/2012  17:24
 1/24/2012  17:24           5,294  DirPlus.btm
 1/24/2012  17:23          55,353  TempFileQPWZ.txt
 1/24/2012  12:27         567,598  PWQRTS.txt
***** TEMPFILEQPWZ.TXT
  End of *Dir to count files:  1/24/2012  17:23
 1/24/2012  17:17           5,294  DirPlus.btm
 1/24/2012  12:27         567,598  PWQRTS.txt
*****

The bottom line is that, while using CLIP: might seem to be a very good idea, in practice it isn't, given that using a "temporary" file is 163 times faster in this particular case, despite requiring two more lines of code.

Two notes that may be significant. First, again, Z: is a RAM disk. Second, "@BaseFileName" is a function that I wrote and use almost constantly that "delivers" the "base file name" of its input parameter, where the "base file name" is defined as whatever name is given to the function, stripped of its path and extension; if the input argument is not specified (as in this case), it defaults to the name of the batch file ("%0") that invokes it. I use it both for error reporting ("blahblah" ended due to error.) and for generating temporary file names (particularly with the "_PID" internal variable appended) that absolutely avoid name "collisions".

And yes, I could use "@Unique", but I prefer this because the file name is a constant for any given series of runs of a batch file that invokes "@BaseFileName" in the same TCC session, which I find handy for "debugging" purposes. (Not that it really matters all that much, but the "tabs" of my Take Command windows contain the "PID" of that particular TCC session.) The definition of the "@BaseFileName" function is:
Code:
BaseFileName=%@FileName[%@Left[-%@Inc[%@Len[%@Ext[%@If["%@UnQuote[%1]" == "",%_BatchName,%@UnQuote[%1]]]]],%@If["%@UnQuote[%1]" == "",%_BatchName,%@UnQuote[%1]]]]
I will "spare you" from my "detailing" how the above works because I'm rather sure you can figure it out for yourself if you are interested.
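(For anyone who would rather see the idea than decode the nesting, a Python analogue - an illustration of the logic only, not my actual code - would be:)

```python
import ntpath
import sys

# Analogue of @BaseFileName: strip the path and the extension from a
# Windows-style file name, defaulting to the running script's own name
# (the role that %_BatchName / %0 plays in the TCC version).
# ntpath is used so the backslash handling works on any platform.
def base_file_name(name=None):
    if not name:
        name = sys.argv[0]
    return ntpath.splitext(ntpath.basename(name))[0]

print(base_file_name(r"D:\DOS\DirPlus.btm"))  # DirPlus
```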

- Dan

P. S. If anyone is curious what I am attempting to do in this batch file, it's relatively simple. You see, I find myself entering the command:
Code:
dir <optional file-name pattern> /K /M /H /A-D /O-D | List
literally dozens of times in any given day. (And yes, I know that the "/H" is unnecessary with "/A-D", but I do it strictly out of habit and it's a habit I've got no interest in breaking.)

And I wanted to make a batch file with the following characteristics:

1. "/H" is used on the "internal" "Dir" command unless "/H" is coded when invoking this new command. (I really never want to look at the dot and double-dot directory entries, but I'm allowing the possibility that some day I might.)

2. The "effects" of "/K" and "/M" are "inverted" unless "/S" is also coded, in which case they retain their "normal" meanings.

3. The "List" command may be executed on the results of the internal "Dir" command unless "/NOL" (no-list) is coded when invoking the command.

4. The "List" command is always executed on the results of the internal "Dir" command when "/LI" ("List", of course) is coded when invoking the command.

5. If neither "/NOL" nor "/LI" is coded, the output of the dir command is "fed" to the list command unless there are 14 or fewer output lines (this is why I wanted an exact count of the number of output lines in the first place), in which case it is written directly to the console or whatever the output of the command is "piped" to.

6. Unless otherwise specified (by coding the "/O" parameter when invoking the command), the output is sorted in reverse order by date (so the newest is at the top of the list) if the output of the internal "dir" command is being piped to the "List" command, and by date (newest last) if the output is going to the console.

7. "/A-D" is the default unless "/A" is coded when invoking the command.

8. All other parameters have their "usual" meanings if coded.
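The decision in points 3 through 5 boils down to a small dispatch rule; sketched here in Python terms (the 14-line threshold and the switch names come from the list above; everything else is illustrative, not my actual batch code):

```python
# Decide where the DIR output should go: the List viewer or straight
# to the console. force_list models /LI, no_list models /NOL.
def dispatch(lines, threshold=14, no_list=False, force_list=False):
    if force_list:
        return "list"
    if no_list:
        return "console"
    # page through List only when the listing is long enough to need it
    return "list" if len(lines) > threshold else "console"

print(dispatch(["entry"] * 3))    # console
print(dispatch(["entry"] * 20))   # list
```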

And I have an alias:
Code:
Dir=`D:\DOS\DirPlus %$`
(I don't know if the "%$" is required, and I've never taken the time to "investigate" that, because it doesn't hurt anything if it's there and is only three extra characters.)

And I also have alias:
Code:
NDir=`*Dir %$`
where the "N" is for "native" (it's easier than typing an asterisk when invoking the "native" "Dir" command).

And even though I'm not sure that it is completely "done" at this point, I really like it as it now stands and if I want to use the "regular" dir command for whatever reason I simply use "NDir" instead of "Dir", and even I can remember that!:)
 
I haven't followed this thread closely but the apparent slowness of the clip: pseudo-device is striking. I did this little test several times so caching is not a concern. "S:" is "z:\windows\system32".

Code:
v:\> timer & dir s:\ > clip: & echo %@lines[clip:] & timer
Timer 1 on: 21:25:54
2929
Timer 1 off: 21:32:01  Elapsed: 0:06:06.45
 
v:\> timer & dir s:\ > sdir.txt & echo %@lines[sdir.txt] & timer
Timer 1 on: 21:32:13
2929
Timer 1 off: 21:32:13  Elapsed: 0:00:00.18

Should it be **that** slow?
 

Further investigation shows that "echo %@lines[clip:]" takes almost all of that time (over 6 min 6 sec). Is there something wrong with it? A little test app shows it can be done (in the situation above) in 0.03 sec.

Code:
g:\projects\cliplines\release> timer & dir s:\ > clip: & timer /s & cliplines.exe & timer
Timer 1 on: 00:11:37
Timer 1  Elapsed: 0:00:00.16
2930
Timer 1 off: 00:11:38  Elapsed: 0:00:00.19

It doesn't seem right that @LINES takes roughly 10,000 times as long! If it must be that way, I'll write a _CLIPLINES plugin variable.

More investigation suggests there must be something wrong. It takes .18 sec to find and print the 2929th line of the clipboard and over 6 minutes to count the 2930 lines in the clipboard!
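For comparison, counting lines in text that is already in memory is trivial; this sketch shows what a _CLIPLINES-style helper would do once it has the clipboard contents (the clipboard access itself is omitted - the string argument stands in for it):

```python
# Count the lines of an in-memory buffer. splitlines copes with \r\n,
# bare \n, and a missing trailing newline alike, so CRLF clipboard text
# is counted the same way a file of DIR output would be.
def count_lines(text):
    return len(text.splitlines())

print(count_lines("line1\r\nline2\r\nline3"))  # 3
```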
 
Going back to the issue that started this thread. I just realized we all overlooked the internal variables introduced in V13: _pdir_dirs and _pdir_files. With those, one could use PDIR with an empty reporting field, and just use these variables. This should be quite rapid. For example, to count subdirectories of %D:

Code:
pdir /() /s /a:d %D
echo %_pdir_dirs

Note that one can choose items to be counted by their attributes, including date and size ranges, limit directory recursion with /Sn, start at a specific recursion level using /S+n, limit tracing into junctions/symlinks via /Nj, etc. - all the freedoms of PDIR yet relatively fast because no output is generated. And note that by default PDIR excludes header, trailer, and hides dots.
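For readers outside TCC, the counting that PDIR does here can be approximated like this in Python (the max_depth parameter is an assumed stand-in for the /Sn recursion limit, not a documented equivalence):

```python
import os

# Walk a tree and count directories without producing any listing output,
# roughly what PDIR /() /s /a:d does. The dot entries are never counted
# (os.walk, like PDIR by default, hides them).
def count_subdirs(root, max_depth=None):
    total = 0
    base = root.rstrip(os.sep).count(os.sep)
    for dirpath, dirnames, _ in os.walk(root):
        depth = dirpath.rstrip(os.sep).count(os.sep) - base
        if max_depth is not None and depth >= max_depth:
            dirnames[:] = []  # prune: stop descending past the limit
        total += len(dirnames)
    return total
```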
 
really :cool:.
I totally missed these 3 variables.
 
Going back to the issue that started this thread. I just realized we all overlooked the internal variables introduced in V13: _pdir_dirs and _pdir_files.
I think Dan, the OP, is still using V12 so doesn't have the new variables.
 
Well, this may give him the incentive to upgrade, though he may not be able to - he reported in another thread 98% and 99% disk usage on his drives...
 
And since my entire income is from Social Security Disability, which ain't all that much, there wasn't any "compelling" need that I saw to upgrade to V13. However, I recently made some major "rearrangements" to my financial circumstances (I literally had no choice but to do something, because I was literally running out of enough money to eat at the end of the month) by "getting rid of" a number of financial "obligations" I realized I didn't really need in my life (and no, I didn't stop paying anybody I owed money to), absolutely none of which I've really missed since. The only "problem" I have now is that I seem to have been more "successful" at doing that than I would have expected in my wildest dreams, which leaves me with the distinct feeling that I must be doing something wrong somewhere! (I'm not quite sure :) is appropriate, but I do have a sense of humor about it!)

And Frank, in another small bit of humor there is so much "stuff" in TCC that I wonder if anybody (even Rex himself! :)) can keep track of it all!

- Dan
 
... in another small bit of humor there is so much "stuff" in TCC that I wonder if anybody (even Rex himself! :)) can keep track of it all!
When, decades ago, the term "superprogrammer" was in vogue, referring to people who develop programs ten times faster than the average programmer, I quipped: "A superprogrammer is one who can keep a view of the whole forest in mind while drawing the veins of a single leaf" (cf. "losing sight of the forest while looking at a tree"). From the speed with which Rex can correct the occasional bugs, he must qualify as a superprogrammer even under my definition.
--
Steve
 
... From the speed with which Rex can correct the occasional bugs, he must qualify as a superprogrammer even under my definition.
--
Steve
except in the case "DIR /S0" :D
 
Steve, I had a number of things I wanted to say on this subject, but for a number of reasons (which you would probably have immediately understood had you gone there) I wanted to start a "conversation" with you (assuming you were interested, of course!) with my responses to the above, which I didn't really want to make totally "public" because it would have dealt with "issues" I would rather remain at least somewhat "private". (This place is certainly not "private"!!!!) However, I can no longer figure out how to start a "conversation" with anyone on this bulletin board! (Blindness or bad memory in some way? Or does that "capability" no longer exist? I really don't know, and I've spent a fair amount of time "looking" already. :() - Dan
 
