guaw at September 19th, 2005 06:35 — #1
While working on a basic 3D engine, I have run into a problem with my application on some graphics cards.
The engine is very basic:
From a Form used to select a "game", I create a new Control, then use it to create my D3D device. I load meshes and textures from .X files.
When the "game" ends, I dispose of everything, including the Control, and go back to the game-selection Form.
On my computer it works well, but on my test computer I get a Direct3D.OutOfVideoMemoryException after creating/disposing 5 or 6 "games", and only with specific graphics cards.
I looked at Device.AvailableTextureMemory: after each game the value decreases, and once it drops below 0 I get the exception.
If I close and relaunch the program, Device.AvailableTextureMemory goes back to its initial value and I can run 5 or 6 games again, until the next exception.
I tried using .X files without textures and the problem vanished, so I think I must force the program to release the loaded textures, but I haven't managed to.
I tried everything (Texture.Dispose(), Device.EvictManagedResources(), GC.Collect(), ...) but the result is always the same, and always with the same graphics cards.
If I catch the OutOfVideoMemoryException the games still launch, but it would be better to prevent the exception.
So I would like to know if anyone has an idea, has run into the same problem, or knows a good way to free the texture memory.
Thanks in advance
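Here is the kind of per-game cleanup I mean, as a minimal sketch assuming Managed DirectX 1.1 (GameScene and its fields are illustrative names, not my real code):

```csharp
// Illustrative per-"game" cleanup sketch, assuming Managed DirectX 1.1.
// GameScene, _textures, _mesh, _device and _control are made-up names.
using System;
using System.Collections;
using System.Windows.Forms;
using Microsoft.DirectX.Direct3D;

class GameScene : IDisposable
{
    private ArrayList _textures = new ArrayList(); // holds Texture objects
    private Mesh _mesh;
    private Device _device;
    private Control _control;

    public void Dispose()
    {
        // Dispose every texture explicitly; dropping the references
        // and calling GC.Collect() is not enough.
        foreach (Texture t in _textures)
            t.Dispose();
        _textures.Clear();

        if (_mesh != null) { _mesh.Dispose(); _mesh = null; }

        // Dispose the device before the Control that hosts it.
        if (_device != null) { _device.Dispose(); _device = null; }
        if (_control != null) { _control.Dispose(); _control = null; }
    }
}
```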
guaw at September 19th, 2005 11:03 — #2
I found a workaround: using Pool.Default instead of Pool.Managed when I load textures with TextureLoader.FromStream(device, stream, width, height, mipLevels, usage, format, pool, filter, mipFilter, colorKey);
Does anyone have an idea why Pool.Managed raises an OutOfVideoMemoryException and decreases AvailableTextureMemory each time I run a "game" on my test computer, and only with some graphics cards? And can I avoid it without using Pool.Default?
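For reference, the changed call looks like this (a sketch; the sizes, format, filters and color key are placeholder values, only the Pool argument is the point):

```csharp
// Sketch of the texture load, assuming Managed DirectX 1.1.
// Only the Pool argument changed: Pool.Default instead of Pool.Managed.
using System.IO;
using Microsoft.DirectX.Direct3D;

class TextureUtil
{
    public static Texture LoadTexture(Device device, Stream stream)
    {
        return TextureLoader.FromStream(
            device, stream,
            256, 256,           // width, height (placeholders)
            1,                  // mipLevels
            Usage.None,         // usage
            Format.A8R8G8B8,    // format (placeholder)
            Pool.Default,       // was Pool.Managed
            Filter.Linear,      // filter
            Filter.Linear,      // mipFilter
            0);                 // colorKey (0 = disabled)
    }
}
```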
roel at September 19th, 2005 11:27 — #3
If you use the debug version of the D3D runtime from the SDK, with debug messages turned on and something that monitors the debug output, you can check whether D3D prints something like 'memfini!' on shutdown. That indicates you released things correctly; when you do things wrong, it complains.
bladder at September 19th, 2005 23:22 — #4
Well, Pool.Managed creates two copies of your texture: the real one in video memory and a backup in system memory, so that when the device is lost the texture can be restored automatically.
Also, it could be something else that is leaking, not necessarily your textures. Check your vertex and index buffers as well, and do what roel told you: turn on DirectX debugging if you haven't (from the DirectX control panel), then run your program in debug mode and check the output window. D3D will tell you if you haven't released all of your resources.
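One trade-off to keep in mind if you stick with Pool.Default: default-pool resources are not restored automatically after a lost device, so you have to release and recreate them around Device.Reset yourself. A sketch (the DeviceLost/DeviceReset events are standard Managed DirectX; the two helper methods are hypothetical placeholders you would implement):

```csharp
// Sketch: with Pool.Default you must handle device loss yourself.
// ReleaseDefaultPoolTextures / RecreateTextures are hypothetical helpers.
using System;
using Microsoft.DirectX.Direct3D;

class TextureManager
{
    private Device _device;

    public TextureManager(Device device)
    {
        _device = device;
        _device.DeviceLost += new EventHandler(OnDeviceLost);
        _device.DeviceReset += new EventHandler(OnDeviceReset);
    }

    private void OnDeviceLost(object sender, EventArgs e)
    {
        // Dispose every Pool.Default texture so Reset can succeed.
        ReleaseDefaultPoolTextures();
    }

    private void OnDeviceReset(object sender, EventArgs e)
    {
        // Reload the textures on the restored device.
        RecreateTextures();
    }

    private void ReleaseDefaultPoolTextures() { /* hypothetical */ }
    private void RecreateTextures() { /* hypothetical */ }
}
```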
guaw at September 20th, 2005 07:30 — #5
Thanks. I used Dbmon.exe with DirectX debugging, and in fact, even on my own computer, DX prints 'memfini!' only when I close the entire program (the Windows Form that launches the games), not when a single game ends, with both Pool.Managed and Pool.Default. So I must be doing something wrong when I release resources.
The bad news is that this doesn't explain why Pool.Default works better on the test computer;
the good news is that I can now try solutions without moving my program to the other (test) computer each time.