• Welcome to Autism Forums, a friendly forum to discuss Aspergers Syndrome, Autism, High Functioning Autism and related conditions.


alt+tab

thejuice

Any tech heads know why sometimes I can alt+tab out of a game, and other times this feature is locked, even within the same game? (alt+tab to minimise the game window)
 
Presumably you mean in Windoze?
When I used to write software for Windows, the core of the program was a loop that collected messages from the Windows message queue. These messages can be all sorts of things: they might tell all apps the default printer has changed, or they might be the input from the mouse and keyboard, etc.
The program can choose to process those messages and discard them, or to leave them for the OS to handle. So a program (game) could look for the Alt-Tab keypress and simply discard it from the queue, thus preventing the OS from acting on it, or choose not to call the API that would initiate the Alt-Tab process, or even just pop up a dialog to say "Stop pressing Alt-Tab in this game!" 😉.
Take it with a pinch of salt, but as I recall (this was decades back) this is the most likely reason, though any more up-to-date coders may have better info.
FYI: this is coding in low-level languages like C and C++, which are far more common in games for performance reasons.
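To make that concrete, here's a compilable toy version of the "discard it from the queue" idea. Caveat: in practice Windows normally acts on Alt-Tab before a windowed app ever sees it, so real games block it with fullscreen exclusive mode or low-level hooks - this is only a sketch of the queue-filtering logic, with the Windows types stubbed out so it builds anywhere (the constant values are the real ones from winuser.h):

```c
/* Hypothetical stand-ins so this compiles without <windows.h>;
   the numeric values are the real Windows ones. */
#define WM_SYSKEYDOWN 0x0104  /* Alt+key arrives as a "system" keydown */
#define VK_TAB        0x09

typedef struct {
    unsigned int  message;
    unsigned long wParam;
} MSG;

/* A message pump can inspect each message before dispatching it.
   Returning 1 here means "discard it": the default handling that
   DefWindowProc would normally perform never happens. */
int should_discard(const MSG *msg)
{
    return msg->message == WM_SYSKEYDOWN && msg->wParam == VK_TAB;
}
```

In a real pump this check would sit between GetMessage and DispatchMessage: if it returns nonzero, the loop simply skips the dispatch.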
 
I have the same issue with some games and I'm not using Windows. I run Linux and I play both Windows and Linux games. Even some Linux games don't allow you to Alt-Tab out of them which can be annoying if messages come in while you're playing.

Interesting side note: there are quite a few Windows games that have dire warnings about using Alt-Tab even though they allow you to do it, and in Linux that seems to work flawlessly and cause no problems. One of those games is No Man's Sky, which is quite heavy on the graphics. In that game there is no pause, it's designed to be played in real time, but using Alt-Tab pauses the game. :)
 
Hmmm, I wonder, could that just possibly be because Windows is a bit rubbish at playing games?
Was never designed to do so, the graphics interface was only ever meant for WIMPS (the desktop systems, not people who are frightened to use them! 😄).
DirectX was basically a cheat that drilled through the windows system to hit the hardware efficiently (well, more efficiently), and never was really a well designed and integrated part of windows.
I've never coded Linux though so couldn't say how it handles these things.

Fallout 4 does something similar to No Man's Sky: if I alt-tab or Win-tab out of it, it appears to send an Esc keypress to go to the game menu (or triggers it directly in code). On return to the game, another Esc returns to play.
 
Alt-Tab is simply a command to switch from one window to the next when you have several programs open. If you keep holding the Alt key down and tap the Tab key, you should see a menu on your screen showing all the programs that are open. While still holding Alt, every tap of Tab highlights a different window in the list; when you let go of Alt, the highlighted window jumps to the front. That is its specific purpose.

There shouldn't be any real problem in using it at any time, except for the way in which Windows handles graphics and memory, which means with some games it will cause the game to crash. In games where Alt-Tab isn't an available option, it's just as likely that the programmers didn't bother to include the code for it as it is that it might cause problems.
 
So is alt-tab something you're not really meant to do?
Yeah, you're supposed to alt-tab. @Boogs has the technical answer. I will try and simplify it.

Think of the operating system as like a street intersection, and the programs as the cars. Sometimes cars ignore the road rules and blast through stop signs.

Programmers can and do ignore street signs (speed limit, lights, etc) when writing programs. So that's why alt-tab sometimes doesn't work. Maybe they want you to get addicted to the game and not stop playing it!
 
The car analogy works for some things, but not that well for this.

There are many layers of software in a computer (or smartphone). Far too many to explain in a post. But keyboards might be possible.

A keyboard connects to a computer in a way that's most easily modelled by a network connection. The keyboard has a small logic chip that "knows" a protocol (mostly USB these days), and can announce itself as a keyboard.
Each key generates a unique code that's carried to the system over this protocol.

There's a layer of hardware stuff that's out of scope, then a very low level program will be started and will begin reading the keycodes off the "inner side" of the USB layer. These are sometimes called "drive(r) routines" and similar names. They're not supposed to miss any data from the external device (our Keyboard in this case). This is easy with Keyboards, which are very slow. Mice send many more signals (fewer unique codes though).
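As a toy model of that "don't miss any data" requirement, here's the kind of ring buffer a driver might keep between the interrupt side (pushing raw keycodes) and the OS side (popping them later). All the names and sizes here are invented purely for illustration, not any real driver's code:

```c
/* Fixed-size ring buffer: one slot is sacrificed to tell "full"
   apart from "empty". All names are illustrative. */
#define KBUF_SIZE 16

typedef struct {
    unsigned char buf[KBUF_SIZE];
    int head;   /* next write position (interrupt side) */
    int tail;   /* next read position (OS side) */
} KeyQueue;

/* Called once per key interrupt; returns 0 if the buffer is full
   and the keypress would be lost. */
int kq_push(KeyQueue *q, unsigned char code)
{
    int next = (q->head + 1) % KBUF_SIZE;
    if (next == q->tail)
        return 0;               /* full: key dropped */
    q->buf[q->head] = code;
    q->head = next;
    return 1;
}

/* Called by the layer above; returns 0 when no keycode is waiting. */
int kq_pop(KeyQueue *q, unsigned char *code)
{
    if (q->tail == q->head)
        return 0;               /* empty */
    *code = q->buf[q->tail];
    q->tail = (q->tail + 1) % KBUF_SIZE;
    return 1;
}
```

The point of the buffer is exactly the slowness mentioned above: the interrupt side can get several keycodes in before the consumer wakes up, and nothing is lost as long as the buffer doesn't overflow.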

Drivers have no idea which program "owns" the keyboard at the moment. That's part of what the main operating system handles (program start/stop and context switching, memory management, for Windows run the "Desktop" GUI, ... and interaction with the hardware layers via variations on "drive routines").

At this point there's a lower level of the operating system that can do whatever it wants to with incoming signals and data (keyboard key pressed, mouse moved, block of data from a disk delivered, etc).

It makes sense for there to be a few interrupts that can get through all the applications and do things to:
* The hardware (e.g. CTRL-ALT-DEL, and at least one weird sequence to get to the "BIOS" (which is part of the H/W, not the operating system)).
* The operating system, "behind the back" of whatever application has priority in the higher levels of program management

The last one is what this question is about.
If you had no way to stop a program, or switch to another one, an application could take over the system, safe from anything except some autonomic OS interaction or the box's power switch. This does happen occasionally, and can be a rage-inducing experience :)

So the operating system provides ways to tell it to do something that overrides the active application in small ways. Such as switch to a different program.

Some of them are keyboard codes. The OS does this by making a decision right at the "inner side" of the K/B drive routine - some keycodes aren't available to normal applications, because the OS grabs them for its own purposes.

But ....

... this is implemented by a low-level API in the OS. Something normal coders can't always get at, depending on the coding stack (e.g. AFAIK standard Java can't do this, though Java can call C code that can).
But it's still just an API, and people who know what to do with it can hook some of these special keycode interrupts themselves.
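On Windows, the usual way to hook these is SetWindowsHookEx with WH_KEYBOARD_LL: the callback sees each keystroke before the shell does, and returning nonzero (instead of calling CallNextHookEx) swallows it. Below is just the decision logic such a callback would apply, with the real constant values defined locally so the sketch compiles without windows.h:

```c
/* Real Windows values, defined here so this compiles anywhere;
   a real hook would #include <windows.h> and install itself with
   SetWindowsHookEx(WH_KEYBOARD_LL, ...). */
#define VK_TAB        0x09
#define LLKHF_ALTDOWN 0x20   /* bit set in KBDLLHOOKSTRUCT.flags */

/* The test a low-level keyboard hook makes to spot Alt+Tab:
   the Tab key code, with the Alt-down flag set. */
int is_alt_tab(unsigned int vkCode, unsigned int flags)
{
    return vkCode == VK_TAB && (flags & LLKHF_ALTDOWN) != 0;
}
```

A fullscreen game's hook would return 1 from the callback when this matches, so the Alt-Tab never reaches the OS's window switcher.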

Many game coders have the necessary skills and toolkit. Not all game coders, but any "big" game will have access to such people.

I'm not interested in the smaller details of this, but it's certainly true that the OS could reserve some special keycodes to itself (i.e. not readily available to even technical coders).

But the question proves that Alt-Tab can be "taken away" from the OS by a user program. This isn't necessarily a bad thing, but that's another discussion.

It's certainly the case that some applications react badly to being suspended by something external to the program, and instead want to save their internal state before they give up control. A multi-player RTS is an example of a candidate for this, but I don't know for sure.

BTW there's always a tech fix to resolve such an issue, but if it's crazy expensive compared to its value, it won't be used.

PS: Windows and Linux are very different.
Windows is very bloated in comparison - the Desktop being closely integrated with the OS is an example of the mixed +/- effects of that. And also an area I mostly avoided here because it would require a bunch of mostly irrelevant extra text and/or things I don't understand well enough to explain clearly (or at all :)

So a principle: if you have finite time or energy to talk about how IT stuff actually works, there are always layers above and below what's actually discussed. Even PCs, game consoles, and smartphones are very complex.
 
Just to be really really annoying and such! (my little pleasure in life)...

Also, much of the above is quite right in its descriptions, though there are some small discrepancies, at a nerdy level beyond what any normal healthy human would want to bother with these days! (hence why I luv it! gruesome for the sake of it! 😄).

The BIOS (Basic Input Output System) is actually not hardware at all! It's software stored in a hardware ROM chip (read-only memory) on the motherboard.
It's called the BIOS because it contains a set of routines that allow software running in memory to access the hardware on the motherboard's bus (keyboard, screen, disks, etc). It comprises a set of functions that can be called from a low-level language such as assembler or C; in C you use the int86() function (if I recall right, this was the late 80's when I was doing this stuff) to do things like reading and writing to the disk at a very low level, basically selecting raw sectors to read and write, with no disk operating system or suchlike - great for digging into things like the FAT table, or searching for deleted files, etc. Lotsa fun to be had back in DOS days. This is what the lowest-level device drivers would use to connect to and access the storage. There's other stuff in there too, like reading the keyboard queue (back before USB existed - anyone remember PS/2 connections?); nowadays the USB drivers do this, pulling those keypresses into a Windows driver and up to the OS.

In addition there's also a similar ROM chip on the motherboard called the DOS (not to be confused with MSDOS) which held similar functions but higher level, more suitable for a Disk Operating System to call to manage the FAT table and open/close and read/write files etc. Also called with the int86 function.

As for needing to use a Windows API to get the keypresses: yes, if you're playing dodgy games like writing a keylogger you may need this to sit beneath the OS's notice and steal those keys before Windows even knows, but you can also do this within the Windows OS and hook into that queue with callbacks and APIs etc.

But when you code in Windows using C or similar without a framework library, the core of the program is simply a loop that pulls the messages Windows sends to the app and handles them, and key presses are one of the types of messages.

You'd write a main function called WinMain, like this (C++ version)...
int PASCAL WinMain(HINSTANCE hInstance, HINSTANCE hPrevInstance,
                   LPSTR lpszCmdLine, int nCmdShow)


(the PASCAL bit specifies the calling convention - i.e. the order the parameters are pushed on the stack and who cleans them up afterwards; the Pascal convention differs from C's default, it's nothing to do with endianness)
You'd define the window class you want your app to have (sizable, dialog, close icon, etc etc) and register it with windows. Then create your window...
hwndMain = CreateWindow("MainWndClass", "Sample",
                        WS_OVERLAPPEDWINDOW, CW_USEDEFAULT, CW_USEDEFAULT,
                        CW_USEDEFAULT, CW_USEDEFAULT, (HWND) NULL,
                        (HMENU) NULL, hinst, (LPVOID) NULL);


(Note a hook there to add a menu to that window: (HMENU) NULL (the null means no menu))
Do a ShowWindow and UpdateWindow (to paint it's contents on the screen), then the meat and two veg, a loop to pull those messages from the queue and process or discard them or pass back to Windows...

while ((bRet = GetMessage(&msg, NULL, 0, 0)) != 0)
{
    if (bRet == -1)
    {
        // handle the error and possibly exit
    }
    else
    {
        TranslateMessage(&msg);
        DispatchMessage(&msg);
    }
}


GetMessage pulls the next message from the queue; then, if there's no error, TranslateMessage and DispatchMessage process it. DispatchMessage hands the message to the window procedure (WndProc) you registered with the window class - that's where the app's code looks for a relevant keypress (or whatever) and processes it: if (key == left_arrow) { move ship to left }, that sorta thing but less crude.
I can't say about Win64 but this definitely still exists in Win32. When you code in something higher level like C# this still exists but is buried deep in the code framework; it's all done for you.
Eventually a WM_QUIT message reaches the loop (when someone clicks the 'x' to close the window, or whatever) - GetMessage returns 0, the loop exits, and the app closes down. You write code to free up memory, release resources etc etc. So much ruddy housekeeping sometimes!

So my guess is the processing of Alt-Tab and Win-Tab is handled by Windows by default, or the programmer overrides this and writes their own code to process that keypress (Alt combinations reach the queue as WM_SYSKEYDOWN messages), or maybe ignores it altogether so Alt-Tab does nothing in the game (if desired).
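Here's a toy, compilable sketch of that per-key handling inside a WndProc. The types and constants stand in for the windows.h ones (the numeric values are real); ship_x and the step size are invented for illustration:

```c
/* Stand-ins for the windows.h types/constants so this builds anywhere;
   the numeric values are the real Windows ones. */
#define WM_KEYDOWN 0x0100
#define VK_LEFT    0x25

static int ship_x = 100;   /* invented game state, purely illustrative */

/* Simplified window procedure: DispatchMessage calls the real thing.
   Returns 0 when the message was handled; real code would otherwise
   pass the message on to DefWindowProc. */
int WndProc(unsigned int msg, unsigned int wParam)
{
    if (msg == WM_KEYDOWN && wParam == VK_LEFT) {
        ship_x -= 5;       /* move the ship left */
        return 0;
    }
    return 1;              /* not handled */
}
```

Frameworks like C# hide exactly this: the dispatch loop ends up calling something equivalent to this function for every message.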
 
@Boogs

Naturally in a discussion of technical trivia, nitpicking isn't merely acceptable, it's a moral obligation :)

I know what the BIOS is, but I'd already written enough, and didn't feel it was essential.
Nice to have it covered separately though.

I'm not sure about others, but for me, understanding how the H/W gets the operating system up and working was an important part of my early IT education.
FWIW I learned this on mainframes, at a time when the OS and the H/W were a bit more closely integrated, but already in the time of microcode, and we had pipelining. And H/W assists for hypervisors.
All of that has become better over time, but occasionally I get some amusement from pointing out that modern hypervisors like VMware are reimplementing pre-PC tech :)

In case you're wondering, you are (or were) a better coder than me. But I understand the "big picture" quite well.

All of my low-level coding was in mainframe assembler, and I missed out on the "C-era" entirely. Now it's all Java, which isn't as much fun as assembler nor as fast, but coding goes a good bit faster :)

I'll have to take a long look at the code you included. You may have exposed yourself to some "idiot questions" :)
 
nitpicking isn't merely acceptable, it's a moral obligation :)
Well thanks for being a sport about it! 😊 I suffer from terminal waffle, and if it's anything techy or science, watch out! there servers gunna get stressed!

I tend to go off on one, and forget some poor bugga's expecting to actually read my drivel!
But that said - it's exceptionally good quality drivel if I do say so myself (well, no-one else will, so I have to step forward and say it myself!).

I was a little later to the game than yourself I suspect, I started in a small way when a mate got a ZX81. Prior to that I thought computers would be far beyond my capabilities to understand and use. But this guys sitting in front of his '81 plugged into the telly, trying to get a small BASIC program to draw a simple square on screen. It ain't working, so I looks at this code, knowing sod all about programming, and it's really obvious! Look - there's this loop thingy, FOR NEXT wotzit, and the syntax is so obvious you can read it if it's already there (i.e. if the keywords are already there it ain't hard to work out what they are doing without a manual). And I see what it's trying to do, two loops, one to go x=1 to 10 with a plot x,y and a plot x, y+10, then second loop doing same for y, thus drawing two horizontals then two verticals to make the square. "Here!" Cries I, "lemme try!". "Aw, go on then", he says, "but it's broken, bet you can't get it to...oh, it's worked! How did you do that?".

And it was all downhill from there! BASIC was clearly a gateway drug! Far too unsatisfying to stick with, but it showed the principles. Got myself an Acorn Atom (precursor to the BBC Micro), and that little 2K RAM beast (upgraded to a whole 12K soon after!) was a huge step forward. Not only was the BASIC far faster, it had pointers and thingies (indirection)! And you could even include some assembler and it would precompile on running the BASIC program, and you could then call the machine code from the BASIC code (e.g. make your own 8x8 sprites animate for a game etc). Plus, a manual showing memory locations to read and write for special functions.

Eventually I ended up hooked on C, spending all my money on compilers, mainlining code into my veins, begging on the streets for enough to buy more RAM. A sorry and sordid tale to be sure!
It was when someone sold a PC card to plug into the underside of my Amiga that I was totally finished, Turbo C was followed by Turbo C++, nights spent coding into the early hours for the sheer pleasure of it, sigh! If only the tolerance hadn't caught up with me.

I did do a little work on a VAX/VMS, but only some DCL scripting, mainly to convert instrument data from a serial port into something that Excel could work with, binning the data and such like for the researchers (drug research labs) to crunch from various compound tests. but never dug into the thing, though it was impressive. Cutler who wrote the VMS OS was poached by Microsnot to write Windows NT (thank gawd! A real OS at last from MS!).

As for being a 'better' coder, I'd hold that one in arbitration to be honest. But with the PC being the first thing I could really get to grips with, and find the tech documents to show where the interesting bits were, I was a pig in sh*t! If I got it wrong, if I didn't understand, it was only because I hadn't looked hard enough, the answers were always there to be found in the machine itself.

But it really relates to my own condition more than being real clever or anything. I can only remember semantics. Systems, schemas, basically how things work. My whole experience of everything has had to revolve around a world of black boxes, and until I can take them apart (mentally speaking) and understand what they are and how they work, I'm pretty dumb, and it's the only way I can relate to anything really. So finding something so crisply and sharply defined, with zero ambiguity, was such a pleasure to absorb for itself, it became a thing in itself rather than just needing to learn it to get by. Facts and figures and all the rest can only be recalled if I can hang them on a data structure of knowledge and process internally. So maybe it's more a reflection of my dysfunction than my ability?
But who can say in the end, maybe just that pleasure of discovery was enough in itself, that I need not care either way?

C is very much like a high level assembler language. I know the real hot guys on it could even tell what the output machine code/assembler would be! A very tight linkage, along with the ability to do much of the clever trickery in hitting those 'hidden' parts of the hardware to make the beast sit up and beg. I never had the patience for assembly language, but it taught me much about how processors work (yet another black box).

P.S.
To show how low I have sunk now, having collapsed all my C veins, and suffered numerous compiled deep vein thromboses, I now, like a fine wine aficionado turned alcoholic resorting to VP sherry, mostly churn out PowerShell code to automate and manage AD and the like. Oh! The shame! How the mighty have fallen! Interpreted fer gawd's sake! Not even a JIT compiler! Keerist! What's to become of me?
 
I'll have to take a long look at the code you included. You may have exposed yourself to some "idiot questions" :)
Ok, I have to confess, full disclosure: I nicked the code from the net (search on "win32 message queuing system" or similar). I remembered the basic way it worked (re: my semantic memory) but couldn't recall the actual code itself, so I found a similar example to try and illustrate better. It's more boring than it looks, but this was the page I copy-pasted it from...
https://www.codeproject.com/Articles/5274425/Understanding-Windows-Message-Queues-for-the-Cshar
The fact it's C++ makes little difference, in fact it uses C really (C++ is based on C with added OO keywords etc), and the main difference is things like the PASCAL keyword to affect how parameters are passed - trivial difference really.
Most of the code is just creating the window class and the window itself, it's the last loop to process the messages that's more interesting, and there are probably better examples of that else where (I was being lazy!).

Here's a snippet of actually processing the message queue...
HWND hwnd;
BOOL fDone;
MSG msg;

// Begin the operation and continue until it is complete
// or until the user clicks the mouse or presses a key.

fDone = FALSE;
while (!fDone)
{
    fDone = DoLengthyOperation(); // application-defined function

    // Remove any messages that may be in the queue. If the
    // queue contains any mouse or keyboard
    // messages, end the operation.

    while (PeekMessage(&msg, hwnd, 0, 0, PM_REMOVE))
    {
        switch (msg.message)
        {
        case WM_LBUTTONDOWN:  // is the left mouse button pressed?
        case WM_RBUTTONDOWN:  // is the right mouse button pressed?
        case WM_KEYDOWN:      // or has a key been pressed?
            //
            // Perform any required cleanup.
            //
            fDone = TRUE;
        }
    }
}
 
You know, churning all that out has been remarkably therapeutic! Thanks for giving me an excuse! 🤣
 
