The contribution of each input is the weight you assign to it. These weights are what you change when you train your network.
If you mean the contribution each original input makes to the final output, then it is a bit trickier. In that case I assume you are asking how much you should change the weights at the lowest level when all you know is what the output of the final level should be.
There are several algorithms for computing the weight changes, and they all revolve around the same idea. If your output is too high, you lower the weights from the nodes whose activations are high and raise the weights from the nodes whose activations are low. You then follow the same logic back to the next level: for the nodes whose activations are too high, you not only lower their outgoing weights, you also adjust the weights feeding into them, just as if each were an output that was too high. This process can be continued all the way back to the original inputs.
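To make that concrete, here is a minimal sketch of the idea for a made-up network with two inputs, two hidden nodes, and one output, using sigmoid activations and a learning rate I picked arbitrarily. The error at the output is pushed back one level so that each hidden node gets blame in proportion to the weight connecting it to the output; this is the backpropagation scheme described above, not anyone's specific library implementation:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Hypothetical tiny network: 2 inputs -> 2 hidden nodes -> 1 output.
# All weights and the learning rate are arbitrary illustrative values.
w_hidden = [[0.5, -0.3], [0.4, 0.2]]   # w_hidden[j][i]: input i -> hidden node j
w_output = [0.6, -0.1]                 # hidden node j -> output
lr = 0.5                               # learning rate

inputs, target = [1.0, 0.0], 1.0       # one example training pair

for step in range(1000):
    # Forward pass: compute each level's activations.
    hidden = [sigmoid(sum(w * x for w, x in zip(w_hidden[j], inputs)))
              for j in range(2)]
    output = sigmoid(sum(w * h for w, h in zip(w_output, hidden)))

    # Output error signal: how far off the output is, scaled by the
    # sigmoid's slope (positive if the output is too low).
    delta_out = (target - output) * output * (1 - output)

    # Push the error back one level: each hidden node is blamed in
    # proportion to the weight through which it feeds the output.
    delta_hidden = [delta_out * w_output[j] * hidden[j] * (1 - hidden[j])
                    for j in range(2)]

    # Raise weights from nodes that pushed the output the right way,
    # lower weights from nodes that pushed it the wrong way.
    for j in range(2):
        w_output[j] += lr * delta_out * hidden[j]
        for i in range(2):
            w_hidden[j][i] += lr * delta_hidden[j] * inputs[i]

print(abs(target - output))  # final error shrinks toward 0 as training runs
```

After enough iterations the output error becomes small, even though only the desired final output was ever given; the weight changes at the input level were derived entirely by propagating the output error backwards.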
If this also is not what you meant, then I'm afraid you will have to try to explain again what your trouble is.