RE: That would be VERY evil! PyroBoy (6 replies, 0 views) (2000-Sep-10)

That just plain sucks... :-)
So just because a display setting is enumerated by GetDisplayModesEnum, that doesn't mean it'll work with the frame buffers in video memory? Damn.
So how is my program supposed to determine what screen modes it's able to run in? (And by "run", I mean "run in video memory". The speed hit is bad for sysmem 2D, and it definitely won't do for Direct3D.) Try every mode returned by GetDisplayModesEnum in turn, setting up primary/back/possibly Z buffers in each, and checking for errors? Seems rather crude... Is there a way to tell GetDisplayModesEnum that you only want to enumerate modes where the primary surface and an equally big back buffer both fit in video memory? If so, how do you tell it that there will also be a Z buffer for Direct3D, and to only enumerate modes that will fit that too?
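For what it's worth, here's roughly what I mean by the crude approach, written against the DirectX 7 for VB objects. This is just an untested sketch from memory, so the exact signatures and constants (DDEDM_DEFAULT, DDSDM_DEFAULT, etc.) need checking against the SDK docs:

    ' Probe every enumerated mode by actually trying to build a
    ' video-memory-only flipping chain in it.
    Dim dx As New DirectX7
    Dim dd As DirectDraw7
    Dim modes As DirectDrawEnumModes
    Dim ddsd As DDSURFACEDESC2        ' left zeroed so every mode is enumerated
    Dim primDesc As DDSURFACEDESC2
    Dim prim As DirectDrawSurface7
    Dim i As Long

    Set dd = dx.DirectDrawCreate("")
    dd.SetCooperativeLevel Me.hWnd, DDSCL_EXCLUSIVE Or DDSCL_FULLSCREEN

    Set modes = dd.GetDisplayModesEnum(DDEDM_DEFAULT, ddsd)

    For i = 1 To modes.GetCount
        modes.GetItem i, ddsd

        On Error Resume Next
        Err.Clear

        ' Switch into the candidate mode...
        dd.SetDisplayMode ddsd.lWidth, ddsd.lHeight, _
                          ddsd.ddpfPixelFormat.lRGBBitCount, 0, DDSDM_DEFAULT

        ' ...and demand a flipping primary + back buffer in video memory only.
        primDesc.lFlags = DDSD_CAPS Or DDSD_BACKBUFFERCOUNT
        primDesc.lBackBufferCount = 1
        primDesc.ddsCaps.lCaps = DDSCAPS_PRIMARYSURFACE Or DDSCAPS_FLIP Or _
                                 DDSCAPS_COMPLEX Or DDSCAPS_VIDEOMEMORY
        Set prim = dd.CreateSurface(primDesc)

        If Err.Number = 0 Then
            ' Everything fit in video memory -- remember this mode as usable.
        End If

        Set prim = Nothing
        On Error GoTo 0
    Next i

    dd.RestoreDisplayMode

Presumably the same trick extends to Direct3D: one more CreateSurface with DDSCAPS_ZBUFFER Or DDSCAPS_VIDEOMEMORY (plus whatever Z pixel format the device wants) before declaring the mode good. But that's a lot of mode switching just to find out what works.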
I'd also be interested in finding out how Windows interacts with DirectDraw. I've been told that the GDI keeps some video memory (the amount depends on what resolution your desktop was in when the app was run). That's why you can FlipToGDISurface and have your previous Windows graphics come back up. Is there a way to kick the GDI back out of video memory while your app is running? Like, by obtaining the handle to the GDI surface and setting it to Nothing? Or will that crash and burn the system when the game is over and Windows tries to start using the GDI again?
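On the FlipToGDISurface point, the only shutdown sequence I know is the polite one. Again an untested sketch (dd here stands for an existing DirectDraw7 object that's still in exclusive mode):

    ' Flip back to the surface GDI was drawing on so the desktop shows up
    ' again, then restore the original mode and drop out of exclusive mode.
    dd.FlipToGDISurface
    dd.RestoreDisplayMode
    dd.SetCooperativeLevel Me.hWnd, DDSCL_NORMAL

Whether you can safely yank the GDI's memory out from under it *before* that point is exactly what I'd like to know.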