I'm writing a function which has to determine and return the visibility status for (3D) objects.
Now, you can picture how many times per second this function is being executed.
Speed is critical.
A lot of optimization went into it, and I ended up with 16 flags (besides the other variables) that get declared, used and destroyed every time the function is called, executed and returns.
These flags simply need to hold a 2-state value; whether it's true/false or 0/1 makes no difference.
I can't make up my mind about the data type of these flags.
I ran a simple benchmark using both bool and dword for the flags.
Surprisingly, the benchmarks showed pretty much the same results (100 million operations in roughly 6.2 seconds, on an Intel Pentium III at 1 GHz).
At first I ran the bool bench, then the dword bench.
That gave 6.2 seconds for bool and over 7.4 for dword.
But then I ran the dword bench before the bool bench, and now the output was 6.2 for dword and, again, over 7.4 for bool :wacko:
So it seems to be a sort of first come, first served: whichever runs first gets the best result (might be due to the CPU temperature rising during the first test?).
Well, in the end both data types give the same results.
So I can't decide: which is better, dword or bool?
Thanks for reading.

Regards.