A minor but annoying issue with the "MD" command...

The following illustrates the problem exactly:
[S:\]Timer On & MD "A Directory" /D & Timer Off
Timer 1 on: 16:19:56
Timer 1 off: 16:22:35  Elapsed: 0:02:38.24
[S:\A Directory]
I really doubt that this makes any difference, but "S:" is a drive letter that is subst'd to a directory on Z: (my RAM drive).

And while this is the first and only one I've ever actually timed, the elapsed time is pretty close to what I believe the average is. Why? (Exactly the same thing done from the root of my D: drive (a partition on a real, physical hard disk) took 0:01:12.45, somewhat less time, though I don't know if that's particularly relevant; and doing the same thing on a drive letter subst'd to a directory on the D: drive took 0:01:23.01, so maybe subst has something to do with it.)

- Dan


Staff member
May 14, 2008
As Charles implied, the /N option can take a second or two if you have a very large extended directory search database (jpstree.idx). (MD without a /N will insert the new directory at the appropriate place in jpstree.idx.)

The /D option only takes a few milliseconds. /C can take a little while, but that's dependent on Windows.
Well Charles, that was a very good idea (I don't ever use cmd.exe and haven't for probably 20 years, so the thought would never have occurred to me), so I tried it. While I don't know of any way to actually time it under cmd.exe, the directory was, from my point of view, created pretty much instantaneously. So this is strictly a TCC issue of some kind. (Something to do with that "database" TCC keeps containing information on directories?)

And Rex, I was entering this at the same time you were entering your posting, so I hadn't seen it. And it definitely is related to the index database, because "/N" changed the "create" time to .03 seconds on my Z: drive, a drive letter that is subst'd to J:, which is the real drive letter of my RAM disk. However, the "RD" still took 0:01:11.74 (because it is still looking in the database? I don't see any way to turn that off if so). Quick question: is there any easy way (other than creating an alias) I can make "/N" the default?

- Dan

P. S. I just did it again on the drive letter that my RAM disk really is ("J:"), and it took 0:01:06.33 to create a directory on that drive in TCC, and 0:01:12.78 to delete it. And I just did it again on my H: drive, a partition on a real, physical disk on my laptop, and it took 0:01:20.59 to create a directory and 0:01:33.63 to delete it. These two things pretty much prove that it has little, if anything, to do with the RAM disk (which is otherwise very fast, one of the reasons I use it). And it took 0:03:41.25 to create a directory on my Z: disk (subst'd to "J:") and 0:01:16.26 to delete a directory on the same disk, so maybe subst'ing has something to do with it (kind of inconsistent results between "MD" and "RD"), although it is far from being the only cause.

Charles Dye

Super Moderator
Staff member
May 20, 2008
Albuquerque, NM
Did you try MD /N?

If it turns out that is the issue, and you can live without the database, then that would be an easy way to free some disk space. (At a minute-plus to update, it must be ginormous!) Alternatively, if you must have the database, then you might consider moving it to your RAMdisk at startup. (Of course, you'd have to copy it back to real storage on shutdown.)


Staff member
May 14, 2008
You either have the largest jpstree.idx ever created, or you've got one that's hopelessly mangled (or circular?). I have 47,000 directories in mine and it takes about a second to update.

Try deleting it and recreating it (with CDD /S).
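For anyone following along, the delete-and-rebuild amounts to something like the following TCC session. This is a sketch: the jpstree.idx path shown is the default per-user location mentioned later in this thread (yours may differ), and the /NJ switch is the one Dave recommends below to avoid following circular junction points.

```shell
rem Delete the (possibly mangled) extended directory search index.
rem Default location per this thread; adjust the path if yours differs.
del "%LOCALAPPDATA%\JPSoft\jpstree.idx"

rem Rebuild the index; /NJ skips junctions so circular links aren't followed.
cdd /nj /s
```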

The whole point of aliases is to make default commands; there is no other way to do it (or any reason to use another way if there was!).
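A sketch of what Rex means, assuming standard TCC alias syntax (the leading asterisk forces the internal command, so the alias doesn't expand itself recursively):

```shell
rem Make /N (skip the jpstree.idx update) the default for MD.
rem *md invokes the internal command directly, bypassing alias expansion.
alias md=*md /N
```

Any arguments you type after `md` are appended after `/N` automatically, so `md "A Directory"` becomes `*md /N "A Directory"`.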

And yes, RD also updates jpstree.idx.
I suspect that not only is your jpstree.idx large, but, due to the 99.9% use of your C: drive, it is also broken into hundreds of fragments, so updating it takes forever. Try defragmenting all your real drives; it should speed up ALL operations.
Oh yes, another point: when you create directories on your substituted drive, each directory needs TWO entries in the index, one with the real name and another with the substitute name.

I use SUBST on a permanent basis only on my laptop, so that it imitates my desktop better: my desktop work drive is F:, so I have many aliases and variables defined for F:. My laptop has only a single volume, C:, so making C:\ also appear as F:\ provides better portability. The only other time I use SUBST is when I want to use some old DOS program that assumes a full path cannot exceed 63 characters, so that a SUBST to a lower-level directory shortens the path names. This is done AD HOC, temporarily, on the rare occasion I need it.
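For readers unfamiliar with the Windows SUBST command discussed throughout this thread, the basic pattern looks like this (the drive letter and path here are illustrative, not anyone's actual setup):

```shell
rem Map a project directory to a drive letter
subst W: "C:\Projects\Current Project"

rem List all current substitutions (this is the no-parameter form
rem Dan mentions using to recall which letter maps where)
subst

rem Remove the mapping when done
subst W: /D
```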
Thanks, guys!

First, on the subject of "CDD /S": the docs say that it removes the specified drives or directory trees, so I tried "CDD /S *", hoping that meant it would remove everything, and it took 4 or 5 minutes at a guess (I didn't "formally" time it; maybe I should have). However, after doing that, deleting an existing (empty) directory on my Z: drive took 0:04:53.71, and then recreating that same directory took 0:01:18.48, which is not good but not terrible either (although the "RD" really was terrible!). So then I found the jpstree.idx database in C:\Users\DanTheMan\AppData\Local\JPSoft, renamed it to "jpstree-1.idx" (I was a little bit surprised that it allowed me to do that), and created a new, empty jpstree.idx (">jpstree.idx"). After that, deleting an existing directory took 0:00:00.01 and re-creating that same directory took 0:00:00.00(!!!), so the problem is solved (at least for now). However, I am a little surprised, in that the previously-existing jpstree.idx database was only 4,224,688 bytes long, which I don't consider to be particularly large.

And, Steve, I use "subst" almost constantly, because when I am working on any given "project" that has multiple files associated with it, I "subst" the directory containing that project to a drive letter; and if I don't remember what drive letter it is (not all that unusual, given my bad memory, of course), I use a "subst" command with no parameters to see which drive letter it is. At this minute I've got T:, U:, V:, W:, and Y: (not X: at the moment) subst'd, and Z: is always subst'd to J:. (Because of my bad memory I use very long and very "descriptive" directory (and file) names.)

Rex, as far as defragmenting goes, I have the standard Windows "Disk Defragmenter" set to automatically run every Wednesday at 1:00 AM (I am virtually always awake at 1:00AM on Wednesday mornings and therefore this computer is virtually always on at 1:00 AM Wednesday mornings). And as of this minute it reports that my C: drive is 3% fragmented and my D: drive is 8% fragmented, not too bad I would say.

And, Rex, I am not using compression on my hard drive(s). I tried that as a test several years ago, and I really didn't like something about it, but I no longer remember exactly what that "something" was (no real surprise there), and I abandoned the whole idea. I thought an external hard drive (which I have) was a better solution in the long run. And I am using NTFS on all of my drives, even my RAM disk (it initially comes up formatted with some "version" of FAT, but I immediately reformatted it to NTFS). And I only had to do this once, because it saves a disk image of itself to a physical hard disk (always to the same file, so fragmentation is a non-issue) on system shutdown and reloads itself on system startup. It also "saves" itself to that file on a user-selectable schedule.

BTW, as kind of an ironic aside and a bit of a warning re. my (previous!) 1TB Iomega external hard drive: after it died, I lost a lot of stuff (about 250GB, as I recall) that I didn't really need, although the reason it was on that drive in the first place was that I thought someday I might want or need some of it. (A very good example: I keep all "versions" of a program from the very beginning, and I had placed all previous versions of everything I have ever written (almost 75 different programs) on that hard disk; the internal partitions of this laptop being 96.8% and 98% full is the very simple reason why.) I thought that maybe I could get it replaced under warranty, but I could not, because it was out of warranty according to the Iomega website. There was a phone number on the website that I understood to be "help" of some kind if your drive was out of warranty. Well, that wasn't what that telephone number was for (and I haven't quite figured out what it was for), but at any rate the man I talked to at that number asked me what I was "complaining" about, because those drives were only expected to "last" about 2 years!!!!! I somehow doubt that that is well known. As soon as I get the funds (and a hard drive like the 2TB external hard drive becomes available; the other day the one I have was listed as NOT AVAILABLE on the website of the vendor from which I had bought it), I will get a second one, a "backup" for the "backup".

Again, thank you both.

- Dan
You should definitely look to see what is in your jpstree.idx. You can use the TreeExclude environment variable to prevent folders from being indexed. I do "cdd /nj /s" to build the index. I think without the /nj, it will follow circular references that Windows has and make a huge file (or at least it used to do this).
Dave, my current TreeExclude (based on your suggestion) is "A:;B:;C:;D:;E:;F:;G:;H:;I:;J:;K:;L:;M:;N:;O:;P:;Q:;R:;S:;T:;U:;V:;W:;X:;Y:;Z:" (i.e., everything). And Rex, while I won't at all argue that that file can be helpful for many if not most people, for me (maybe because of my disabilities) it's just a pain.
Dave, my current TreeExclude (based on your suggestion )
I didn't suggest that. If you don't want to use the extended directory search, then turn it off in Options > Command Line > Extended Directory Search (set Search Level to 0), and delete the file.

Under normal conditions the file doesn't slow things down noticeably. As Rex said, your file is probably huge, mangled, and/or circular. Doing a "cdd /nj /s" may (probably will) fix things.
And just a final note, Rex: I use aliases a lot (I've got 56 of them defined as of this minute, and this is one of the "features" of TCC that I brag about the most). Not only do I use them to avoid having to put the paths to many apps in my path, and to effectively "rename" an app ("Wo*rd" is defined as "C:\PROGRA~1\MICROS~1\OFFICE11\WINWORD.EXE", for example); I also use them to effectively "redefine" the way some internal commands actually work, by having the alias invoke a batch file (example: Dir=D:\DOS\DirPlus %$) in which I can change the "behavior" of the internal command as I wish.
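As a concrete sketch, the two alias styles Dan describes would be defined like this in TCC (the definitions below are his own examples from this post; `%$` passes all arguments through to the batch file):

```shell
rem "Rename"/launch an app without putting it on the PATH
rem ("Wo*rd" lets anything from "Wo" through "Word" invoke it)
alias Wo*rd=C:\PROGRA~1\MICROS~1\OFFICE11\WINWORD.EXE

rem Redirect an internal command through a wrapper batch file
alias Dir=D:\DOS\DirPlus %$
```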

More detail if you are curious/interested: In the example of the "Dir" command above, "/H" is effectively always specified (although you can change that behavior by actually specifying "/H", which effectively has the opposite meaning from that of the native "Dir" command). When "/S" is not specified, the "DirPlus" batch file supplies "/K" and "/M" as well as "/A:-D" to the "native" Dir command by default, and orders the files by date: if there are going to be more than 14 files listed in the output, it invokes the "List" command with the newest file at the top; if there are going to be 14 files or fewer, it writes its output directly to standard out with the newest file at the bottom. There is a "/Li[st]" option to force it to use the List command, and a "/NoL[ist]" option to force it to write its output directly to standard out. Every "default" behavior can be modified by specifying exactly the same parameters as you would on the "native" Dir command (although in some cases, such as "/H", the option has the opposite meaning). And if the "/S" option is specified (and "/(" is not), the output is exactly what would be produced by the "native" Dir command, except for "/H" being the default. I won't go into detail here other than to say that in some cases (such as when you explicitly use the "/(" option) it invokes the native "PDir" command instead of the native "Dir" command. And the reason I wrote this was so I didn't have to enter "Dir /K /M /H /A-D /O-D | List" over and over and over in any given day. And I truly thank you for giving me this much flexibility; your ability to anticipate needs (such as allowing aliases to override internal commands) is truly phenomenal.
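A minimal sketch of what the skeleton of such a wrapper batch file might look like, assuming TCC batch syntax. This is NOT Dan's actual DirPlus; it is a hedged illustration of the dispatch logic he describes, with the 14-file threshold and the /Li[st], /NoL[ist], and "/(" handling omitted for brevity:

```shell
@echo off
rem DirPlus.btm -- hypothetical sketch of a DIR wrapper (not the original).
rem %$ expands to all arguments passed to this batch file;
rem the leading * forces the internal command, bypassing the Dir alias.

iff "%1" == "/S" then
  rem /S given: behave like the native DIR, with /H as the default
  *dir /H %$
else
  rem Default case: supply Dan's preferred switches and page the
  rem date-ordered output through LIST, as quoted in his post
  *dir /K /M /H /A-D /O-D %$ | list
endiff
```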
And Dave, thank you for telling me about "Options > Command Line > Extended Directory Search". I thought there might be such an option, but I either couldn't find it (a combination of bad eyes and not knowing which "Options" tab it would be on), or it was my (I'll admit it! :)) stupidity in not knowing what that option represented. (And honestly, that name really doesn't tell me what it is even now, but that could just be me, Rex.) As far as "TreeExclude" goes, your posting said "You can use the TreeExclude environment variable to prevent folders from being indexed.", and all I did was "extend" that a little bit by specifying whole drives rather than just individual folders. (Since I use "subst" a lot, as has been mentioned previously, there is no way I could exclude specific folders on specific drives anyway; and if I'm excluding drive letters because they are subst'd drives, I might as well extend it to every drive while I'm at it. As you seem to have surmised, I absolutely never want directory indexing.)

- Dan
David, I agree that the FuzzyCD capability is one of TCC's greatest features. It just shows how everyone uses the tools differently (I'm a FuzzyCD 1 guy myself). I'm glad it's Rex and not I who has to try to make everyone happy....

Great idea Scott. I've not noticed UNC paths, but I'm sure they are there. I wasn't sure how the treeexclude would handle UNC, but that makes sense. Already added it.

David, because you're not me. I am half-blind and clumsy, and 95% of the time when it makes a "guess", it is wrong. And there are two things that, together, almost totally obviate any want or need I may have for it.

The first is the Tab key for file and directory name completion. For example, "np *pd*in[Tab]", entered at the moment in the root directory of my Z: drive, comes up for me as: np "D-Drive PDir Sorted in Reverse Order by INODE, Number of Fields in Path, Length of Path, and Allocated Size.txt". The "np" is an alias I have for Notepad, my most commonly-used editor, and the file name is an indication of exactly what I mean by "fully-descriptive" file names because of my bad memory. (I literally could not survive in an 8.3 world; and yes, there are far more powerful editors out there, but all of the ones I've tried in the past require me to remember things in order to use them effectively, and that's just basically out of the question for me.)

The other thing has been much discussed in different contexts: I use "subst" a lot. This means that I have a drive letter assigned to literally every directory I want to go to on a "regular" basis at any given point in time. And remembering which drive letter a particular directory is subst'd to? Easy: subst with no parameters, and I probably do that dozens of times a day. The absolute "fluidity" of my subst'd drive letters means I absolutely do not want to "index" them, period, end of sentence. The given set of subst'd drive letters can change literally dozens of times in any single day. And I am virtually never in a directory on a non-subst'd drive letter other than my Z: drive, which, being a RAM disk, is very "fluid", so any directory-index information for it would probably not be accurate for more than a day or two at most. And I don't need the constant "updating" of that database, because every directory I go to on a regular basis (literally every directory at the root of my Z: drive) has a drive letter assigned to it.

As I said, I'm not you.

- Dan