I just had this crazy idea, do you think it would work?
What if you stored bytecode sequentially in a texture and "interpreted" it on the GPU?
Pass one texture for the variable data (that the CPU modifies) and another texture for the actual script.
I'm thinking you would mostly use this for AI/actor code.
How big a texture would you need, and would this be feasible?
Maybe interesting if you have a very large number of agents (like 1000s) that are executing relatively simple behaviors and have relatively little state.
You'd also have to consider how to get the data out of the texture to make the agents do their things, e.g. playing animations and moving around. That stuff is usually done on the CPU, but you don't want to do readbacks from video memory.
People have proposed this idea before for particle systems. The idea is each texel represents one particle, and you have a set of textures storing position, velocity, etc, and you use vertex program texture fetches to render the particles in the world.
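That one-texel-per-particle layout can be sketched on the CPU like this (Python standing in for a fragment shader pass over a position texture and a velocity texture; the timestep and gravity constants are just example values):

```python
# Minimal sketch of the "one texel per particle" scheme: two flat arrays
# stand in for a position texture and a velocity texture; one "pass"
# updates every texel, like a full-screen fragment shader would.
DT = 0.1
GRAVITY = (0.0, -9.8)

def simulate_pass(positions, velocities):
    """What the shader does per texel: read the state textures, write
    the next-frame state into a second pair of textures (ping-pong)."""
    new_pos, new_vel = [], []
    for (px, py), (vx, vy) in zip(positions, velocities):
        vx, vy = vx + GRAVITY[0] * DT, vy + GRAVITY[1] * DT
        new_pos.append((px + vx * DT, py + vy * DT))
        new_vel.append((vx, vy))
    return new_pos, new_vel

pos = [(0.0, 0.0)] * 4   # a tiny 2x2 "texture" of particles
vel = [(1.0, 0.0)] * 4
pos, vel = simulate_pass(pos, vel)
```

The rendering side then never touches the CPU: the vertex program fetches each particle's texel to position its point sprite.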
More important is how you handle the information in the texture; you'd also have to interpret the script. I'm not saying it's impossible, but I couldn't do it.
I don't know if scripting on the GPU is feasible, because it usually involves a lot of callbacks into the engine, but GPU AI is possible.
Hmm. I didn't think about readbacks...
I've thought a little more, and I think you could implement it by rendering a single point per entity; you could definitely handle more than one entity at once, though.
Maybe if you used the geometry shader and output a pixel-sized point (I know this is getting a little nuts, hehe) for each "actor arm", you could have so many arms render into a small texture to read back on the CPU. Say you used a 1x8 texture: you tell it to render "movement distance" and "animation frame", then whatever else you needed, to different pixels in this "GPU output" texture... then read it back.
CPU TO GPU
1 by x variable texture (possibly 32-bit floats)
OPERATE ON GPU (prestored in video RAM)
256x256 script texture; you'd have to organize how you lay out the bytecode.
GPU TO CPU (maybe a floating-point or byte texture would do, because reading back is really slow, isn't it...)
1 by x output texture, written by the geometry shader to explicit points on different pixels.
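A rough sketch of that round trip, with the three textures as plain Python lists (the entity attributes, the output packing, and all the sizes here are made-up examples, not anything measured):

```python
# Hypothetical round trip: the CPU writes a 1 x N variable texture, the GPU
# pass consumes it (plus the resident script texture), and writes a 1 x M
# output texture that the CPU reads back in one go.

def cpu_upload(entities):
    """1 x N float 'variable texture': one input texel per actor."""
    return [float(e["health"]) for e in entities]

def gpu_pass(var_texture):
    """Stand-in for the shader pass: emit 'movement distance' and
    'animation frame' for each actor on separate output pixels."""
    out = []
    for health in var_texture:
        out.append(2.0 if health > 0 else 0.0)   # movement distance
        out.append(1.0 if health > 0 else 7.0)   # animation frame (7 = death)
    return out

entities = [{"health": h} for h in (10, 0, 5, 3, 0, 1, 2, 9)]
readback = gpu_pass(cpu_upload(entities))   # one readback for all actors
```

The key point for performance is the last line: the readback cost is mostly per-call, so you'd want every actor's outputs batched into one texture and one read, not one read per entity.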
Well, I'm crazy enough to think it might work, but how extensive a scripting language could be carried out in the geometry shader, I don't really know... Maybe you're right, Reedbeta, and it suits simple AI with lots of entities, but maybe you wouldn't get that many entities because there would be too many readbacks. Dunno.
The cool thing is it would make plugging in AI really easy, just add a shader... that's what really got me excited about the idea.
marcgfx, you'd have to implement a calculator in the shader... does that sound impossible?
Just imagine reading in two variables and an operator; you could make it all a step-by-step procedure, one operation at a time.
You could even have a "goto" command: you just change the texture coordinate that stands for the instruction pointer.
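The step-by-step interpreter with a goto can be sketched like this (Python standing in for the shader loop; the opcode set and instruction format are invented for illustration). On the GPU, `ip` would be the texture coordinate you fetch the next instruction texel from, so a jump is just assigning that coordinate:

```python
# Minimal interpreter sketch: the 'script texture' is a flat list of
# (opcode, a, b, target) texels, and the instruction pointer is just the
# fetch index -- a 'goto' only changes where the next instruction is read.
OP_SET, OP_ADD, OP_JLT, OP_HALT = range(4)  # made-up opcode set

def run(script, regs, max_steps=1000):
    ip = 0  # would be a texture coordinate on the GPU
    for _ in range(max_steps):
        op, a, b, target = script[ip]
        if op == OP_HALT:
            break
        elif op == OP_SET:
            regs[a] = b
        elif op == OP_ADD:
            regs[a] += regs[b]
        elif op == OP_JLT and regs[a] < b:   # the 'goto': jump if regs[a] < b
            ip = target
            continue
        ip += 1
    return regs

# Count to 5: r0 accumulates, r1 holds the increment.
script = [(OP_SET, 0, 0, 0),
          (OP_SET, 1, 1, 0),
          (OP_ADD, 0, 1, 0),
          (OP_JLT, 0, 5, 2),   # while r0 < 5: goto the ADD
          (OP_HALT, 0, 0, 0)]
print(run(script, [0, 0]))   # [5, 1]
```

One caveat worth flagging: data-dependent loops like this mean different actors take different numbers of steps, which GPUs of that era handled poorly, so a fixed "one operation per pass" or a capped step count is probably the realistic shape of it.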