I sure hope to receive help on this, as I'm going crazy over it.
And if the thread's title rang a bell for you, you can probably help me.
There is no Present() method exposed by IDirect3DDevice7. That much I know.
Instead there are DirectDraw's Flip() and Blt() methods (let's ignore BltFast()).
They are still available today... but nowadays you'd naturally use Present().
How about the past, when DX7 was the next big thing?
My dilemma is: would a Direct3D7 game preferably use Flip() or Blt()?
Because the real question is: how do I tell whether a D3D7 game is using Flip() or Blt() to 'present' a new frame on screen?
If I were the programmer I'd use Flip()... but Flip() doesn't work in windowed mode. And I have a few D3D7 games that switch between windowed and fullscreen when Alt+Enter is pressed. Meaning that at some point they resort to Blt(). Which makes me wonder: are we sure they don't Blt() in fullscreen as well? For code simplicity's sake it would make more sense to stick to Blt() all the way instead of switching to Flip(). Image tearing wouldn't be a concern, since you could detect the vertical blank and Blt() during that period.
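As far as I understand it (and correct me if I'm wrong - this is just my reading of common practice, not something I've verified in real games), the typical split would look like this:

```
// my understanding of the typical DX7 presentation paths (pseudocode)
if (fullscreen exclusive mode):
    render the scene into the back buffer of the flipping chain
    primary->Flip(NULL, DDFLIP_WAIT)      // swaps surface pointers at vblank
else (windowed mode):
    render the scene into a plain offscreen back buffer
    primary->Blt(&windowRect, backBuffer, &srcRect, DDBLT_WAIT, NULL)  // copies pixels
```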
To confuse me more there are Microsoft's SDK samples, which happily use Blt() without even bothering to sync with the monitor (but I know those samples are often written poorly). Add to that that my display driver has a weird implementation of DirectDraw's WaitForVerticalBlank() method, which uses polling instead of an interrupt. I bet that's why many DX7 games I have score insane FPS figures yet show no image tearing: they must be updating and presenting dozens of frames during a single VBlank period, at every VBlank. That would explain why my gfx card works like crazy and overheats in a minute despite the 'poor' graphics shown on screen.
Could someone with past experience in DX7 answer my bold question, please?
It's a nontrivial amount of work, but you could use DLL injection to actually see what D3D calls the game is making.
Good Reedbeta, reading your answer makes me realize I haven't described the problem with sufficient clarity.
My fault. Let me try again.
I am working on the very injection you speak of. Nontrivial indeed - but I have it covered by now.
DX9 injection is done already.
Time to do it with other versions, starting with DX7.
(Now stop me if I say something idiotic - I'm so green with DX7 that I may well be overlooking something big)
The assumption: with DX7, both Flip() and Blt() can be valid ways to show the new frame on screen.
That is, they can both mimic the Present() method that appeared in DX8 and later versions.
Case A: a Flip() call is detected. When Flip() is called, the game is undoubtedly 'presenting' the new frame on screen, regardless of how many calls to Blt() may (or may not) have been made before. This is the best-case scenario I can hope for.
Case B: NO Flip() call is detected. If Flip() is never called, the problem gets thorny. Blt() is the only alternative, and Blt() may be called several times during the normal composition of one frame - and there is no guarantee that a single call repaints the whole buffer (that is, examining the Blt() parameters may not be a sure winner). So: how can I detect which of those Blt() calls is the last one, before the game starts working on the next frame?
In simpler words: how can I trap the DX7 call that is the equivalent of DX8's and DX9's Present() method?
Others have done it before me. There has to be a way.
Ahh, I see. Hmm, in that case the final Blt() call should be to the front buffer, correct? And the game should touch the front buffer only once per frame (that's the whole idea of double buffering). So if you can tell where the front buffer is (by sniffing the return from some initialization call perhaps?) then you should be home free.
"Ahh, I see. Hmm, in that case the final Blt() call should be to the front buffer, correct?"
It makes sense.
But I want to write generic code, able to work with all D3D7 titles without needing special switches for specific anomalies (well, as far as possible).
Thinking 'paranoid' for a minute... could it be that a game squeezes all its 3D rendering and Blt() calls into the monitor's VBlank period? That would mean no screen tearing, and yet only one buffer would be necessary -> always the front buffer.
Please tell me it's just paranoia and that you know of reasons that make a single-buffer solution unacceptable and not worth accounting for.
I guess that's possible in principle, but it would be a VERY strangely architected program that did that. Besides, it would be just a silly decision from a resource-use point of view; you'd be massively underutilizing the GPU and therefore greatly restricting how complex your scenes could be, etc. Not to mention that I'm not sure that kind of close synchronization between the CPU, the GPU, and the vblank is even possible on a PC (as opposed to a console, where you are closer to the hardware).
I think you can safely assume that 99.9% of games use the standard double-buffering mechanism.
Thanks for clearing my doubts. Sometimes I don't think straight.
Reading the docs again, I see that IDirectDrawSurface7::Blt() takes an IDirectDrawSurface7 parameter as input - the source surface to blit from.
Sadly, it does not state that the two surfaces have to be distinct objects. I hoped to find it documented as undefined behavior to alias the same surface, but no warning is given. Is it foggy documentation leaving gray zones, or precise documentation to be taken to the letter? I guess I'll have to try the alias and see what happens.
But I believe you nailed it. Employing only one buffer could cripple performance and introduce technical difficulties. It is faster and simpler to employ double buffering.
Ciao ciao : )