Bug: _ypixels in BDEBUGGER

May 22, 2013
The following simple btm file demonstrates the problem.

@echo off
echo %_ypixels
quit

Let's say the screen resolution is 1920 x 1080. On execution, the batch file will display: 1080

Now, on Windows 10, right-click the desktop and select "Display settings". Under "Scale and layout", change the size to 125%.
Execute the simple script again. Now it will display 864 (1080 / 125 × 100), as expected.

But now run the script in the debugger. It will display the wrong value: 1080!
In other words, TCC correctly sees the scaling, but the debugger does not. It's no problem once you know what is happening, but many laptops are set to 125% by default.

Greetings,
Ruud Uphoff
 
May 20, 2008
Syracuse, NY, USA
I think the debugger gets it right. The number of pixels is a physical characteristic of the monitor (isn't it?). I don't think it changes when you make windows, icons, and text bigger; the desktop stays the same size.
 
May 22, 2013
I think the debugger gets it right. The number of pixels is a physical characteristic of the monitor (isn't it?). I don't think it changes when you make windows, icons, and text bigger; the desktop stays the same size.

TCC correctly detects a screen size of 1536 x 864. If that value is wrong, then all scripts I wrote in 2010 or earlier only work correctly by accident, because TCC would be reporting the "right" value of _ypixels in error.

To be clear: the bug means that scripts using _xpixels or _ypixels, which normally work fine, no longer work when run in BDEBUGGER if the display scaling differs from the native resolution.

Regardless of any opinion, it is a bug.

Kind regards,
Ruud Uphoff
 

rconn

Administrator
Staff member
May 14, 2008
It's a Windows issue, not TCC or the debugger.

%_ypixels simply returns the result of the Windows API GetSystemMetrics( SM_CYSCREEN ). The reason for the difference is that when you're running in TCC (a console app), you get the scaled value back from Windows. When you're in the debugger, you're not running TCC -- you're running a GUI app that's using the TCC engine (TakeCmd.dll). And when you're in a GUI app and you make that call, you do *not* get the scaled value.

Why Microsoft implemented it this way (feature? bug?) is something only they can answer.
 
May 22, 2013
It's a Windows issue, not TCC or the debugger.

%_ypixels simply returns the result of the Windows API GetSystemMetrics( SM_CYSCREEN ). The reason for the difference is that when you're running in TCC (a console app), you get the scaled value back from Windows. When you're in the debugger, you're not running TCC -- you're running a GUI app that's using the TCC engine (TakeCmd.dll). And when you're in a GUI app and you make that call, you do *not* get the scaled value.

Why Microsoft implemented it this way (feature? bug?) is something only they can answer.
Thanks!

Aha, of course, that explains the difference.
It's not a real problem; I'll just set the scaling to 100% while debugging. I was just wondering.

Kind regards,
Ruud