TCMD Prompt [email protected]

May 9, 2013
#1
Using the TCMD Prompt Here context menu gives me a TCMD instance with a single TCC tab that does NOT start in the folder the context menu was launched from.

I verified that the registry is set to launch TCMD using /D "%L".
I verified that this actually happens using Process Explorer (%L is expanded properly).

But I cannot figure out how to make TCMD start my tab with that folder as its current working directory.

What is the correct method for doing so? (I just updated to the latest build, and it still fails to work properly.)
 
#2
Try it from the start button ... [path]\tcmd /d folder. (probably the same result)

Are there any tabs defined in TCMD\Options\Tabs (directories specified there will override TCMD's startup directory)?

Does TCMD's "COMSPEC" (Options\Tabs) have a "/D" startup option for TCC?

Does your TCSTART file set TCC's current directory?
 
#3
No tabs defined.
No COMSPEC defined.
The start button... WORKS! :)

I know for a fact that the argument is being passed in from the Windows shell, since I can see it in Process Explorer. It is quoted properly. There are no other extraneous arguments...

... :mad: ... figured it out.

If there is a hidden folder in the path, then this fails. :yuck:

BUG REPORT: TCMD /D "path" should not fail just because one folder or another in the path is hidden.
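For anyone who wants to check whether a directory in a path carries the hidden attribute before handing it to /D, here is a minimal sketch (Python used purely for illustration; the function and sample bitmasks are hypothetical, but the attribute constants match the Windows values exposed by Python's stat module):

```python
import stat

def is_hidden(attrs: int) -> bool:
    """Return True if a Windows file-attribute bitmask has the hidden bit set."""
    return bool(attrs & stat.FILE_ATTRIBUTE_HIDDEN)

# FILE_ATTRIBUTE_HIDDEN is 0x2, FILE_ATTRIBUTE_SYSTEM is 0x4,
# FILE_ATTRIBUTE_DIRECTORY is 0x10. C:\ProgramData is typically
# hidden + system + directory, i.e. 0x16.
print(is_hidden(0x16))  # hidden|system|directory -> True
print(is_hidden(0x10))  # plain visible directory -> False
```

On Windows, `os.stat(path).st_file_attributes` supplies the bitmask for a real path, so each component of a path can be tested this way.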
 
#6
When I un-hid ProgramData, my path started working correctly: C:\ProgramData\Cimex Corporation\CimPACK\15.0\Local\Temp.
When I re-hid ProgramData, Win+R started in C:\Program Files\JP Software\Take Command x64 15.01.
When it fails from the shell context menu, it starts in C:\Windows\System32.

TCMD probably inherits a default folder from the launching code, and simply doesn't reset it to the /D directory when the internal CDD fails (in my case, because the folder is hidden).
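The fallback behavior hypothesized here could look something like this (pure speculation about TCMD's internals, sketched in Python for illustration; the function name and defaults are made up):

```python
import os

def resolve_start_dir(requested: str, inherited_default: str) -> str:
    """Hypothetical launcher logic: try the /D path, silently fall back on failure."""
    try:
        os.chdir(requested)
        return requested
    except OSError:
        # The suspected bug pattern: the failure is swallowed and the
        # inherited default (e.g. System32 or the install folder) wins.
        return inherited_default
```

If TCMD's internal directory change rejects hidden folders and the error is swallowed like this, it would produce exactly the System32 / install-folder behavior observed above.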

Still definitely a bug in TCMD. Maybe you would see failure if you were trying to access C:\Hidden\Some Folder?

Note: using the CDD command in a TCC prompt for C:\ProgramData\Cimex Corporation\CimPACK\15.0\Local\Temp works fine. It's just the /D "C:\ProgramData\Cimex Corporation\CimPACK\15.0\Local\Temp" that fails... (or any subpath of C:\ProgramData).
 
#10
Yeah. I would LOVE to hear from someone who was involved in making the x64 decisions as to what drugs, exactly, they were on. Or, if no drugs, then what the justification was behind the goofy choices that were made when 64 bit was added to the XP code base.

Renaming the existing Program Files to Program Files (x86)?! Why? Why not just have one folder, and let the actual software publishers decide for themselves whether to use the same folder or a different one on a per-product basis? (Our code base would have happily lived in Program Files\Our Company\, both x64 and x86.)

Virtual redirection in the registry. Same with filesystem. None of it makes any sense to me. Just let the devs who want to make x64 based software be responsible for making things work, rather than breaking lots of existing x86 software for no apparent gain.
 
#11
Yeah. I would LOVE to hear from someone who was involved in making the x64 decisions as to what drugs, exactly, they were on. Or, if no drugs, then what the justification was behind the goofy choices that were made when 64 bit was added to the XP code base.

... None of it makes any sense to me. Just let the devs who want to make x64 based software be responsible for making things work, rather than breaking lots of existing x86 software for no apparent gain.
(emphasis added)

The purpose is to make billions of customers buy new (well, not really new, just recompiled or relinked) software so they could continue to do what they had been doing for years. Grounds for a new antitrust suit, perhaps?
 
#12
Early Windows decisions were made intelligently - with the technical difficulties steering many of the stranger ones. So I hesitate to say that it's avarice... but I'm flummoxed as to what technical reason could possibly be behind those decisions. They're just too dumb, too problem-causing (as opposed to problem-solving). I cannot think of any valid reason behind any of them. They just strike me as some of the worst, most foolish technical choices in Windows history.
 
#13
Early Windows decisions were made intelligently - with the technical difficulties steering many of the stranger ones. So I hesitate to say that it's avarice... but I'm flummoxed as to what technical reason could possibly be behind those decisions. They're just too dumb, too problem-causing (as opposed to problem-solving). I cannot think of any valid reason behind any of them. They just strike me as some of the worst, most foolish technical choices in Windows history.
Yes, but consider the size and complexity of the product today vs. 30 years ago. And consider that as you go back in time there were fewer expectations, legacy applications, and legacy behaviors; in the beginning there were none. I suspect that once upon a time there was a small number of people collectively capable of understanding the whole product and fairly capable of coordinating its development. I suspect the numbers have grown. And I suspect the capabilities have diminished (in part, at least, due to the sheer size and numbers).

30 years ago a set of 5-6 floppy disks (Windows) was an intimidating thing. Today I expect that much data to arrive at my computer in a few seconds. If TCMD were distributed on floppy disks, I imagine it would require 15-20 of them.
 
#14
IBM has taken great care to allow customers' legacy software to run on newer machines, to the extent that whole instruction sets and machine architectures were emulated - even, IIRC, to the point that the 7080's emulator of the 650 was included in the 360's emulator of the 7080, so you could run 650 programs on the 360. And in designing the 360 in the early 1960s they made the width of the memory bus transparent to the software - it could be 8b, 16b, or 32b depending on what you could pay for, and later (in the 370s) it expanded to 64b and 128b, maybe even 256b. Workload increased? Get the faster machine, use the same software.

The POSIX (Unix, Linux, BSD, AIX, Xenix, etc.) architecture is also flexible - your old software will always run. Vince, you use sed and other *nix utilities - only the Windows versions needed to be upgraded; the old Linux versions still work today... And how many of the new features of Win7 are intended for the personal computer? Think of SETI running on MS-DOS.
 
#15
Early Windows decisions were made intelligently - with the technical difficulties steering many of the stranger ones. So I hesitate to say that it's avarice... but I'm flummoxed as to what technical reason could possibly be behind those decisions. They're just too dumb, too problem-causing (as opposed to problem-solving). I cannot think of any valid reason behind any of them. They just strike me as some of the worst, most foolish technical choices in Windows history.
Avarice... I'd never heard of that word before, so I looked it up and it's a nicer way of putting EXACTLY what I was thinking.

When a product matures and they can't figure out how to add any more useful features to keep it attractive, they start thinking of other ways of making money (they are a BUSINESS after all) and convolution occurs. Moving features around (I'll lump drastic interface changes in with this) or removing them outright is one way. Removing once-very-helpful context-sensitive help is another. All of this convolution spurs support calls that they make you pay for. And to "fix" the problems they caused, hey, you can just buy their next (even more convoluted) forced "upgrade".

Of course all of this is just a theory... :) but I wouldn't be saying it if I hadn't seen it happen time and time again.