It has been a while since I touched on the topic of artificial neural networks (ANNs) and genetic algorithms (GAs). As you know, a neural net contains a set of neurons, which can be organized in many different ways. The activation function is where some of the fuzzy logic resides. For example, applying a sigmoid to the weighted sum of a neuron's inputs determines how strongly that neuron fires. With supervised training, you can use backpropagation to adjust the weights across all neurons so that the network produces the expected output for a given set of inputs within an acceptable error rate.

With unsupervised training, there is no labeled output to compare against, so a different approach is needed. A genetic algorithm can be used to evolve the weights directly. Essentially, each gene within the chromosome acts as one weight in the network. The fitness score given to a chromosome is based on how well the neural network performs using those weights. If the chromosome encodes bad weights, your ANN will produce bad outputs, which your fitness scoring algorithm should detect. That fitness scoring algorithm is where the bulk of your fuzzy logic will reside. Since the solution is complicated, you need to define a set of acceptable rules to score and train the neural network.
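To make that concrete, here is a minimal sketch of the idea in Python. It is an illustration only, not anything from a production system: I'm assuming a tiny fixed-topology 2-2-1 feedforward net, XOR as a stand-in fitness task, and simple one-point crossover with Gaussian mutation. The names (`forward`, `fitness`, `evolve`) are mine, chosen for clarity.

```python
import math
import random

random.seed(0)

# Tiny 2-2-1 feedforward net; the chromosome is the flat list of all weights.
# 4 hidden weights + 2 hidden biases + 2 output weights + 1 output bias = 9.
N_WEIGHTS = 9

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def forward(chromosome, x1, x2):
    w = chromosome
    h1 = sigmoid(w[0] * x1 + w[1] * x2 + w[2])
    h2 = sigmoid(w[3] * x1 + w[4] * x2 + w[5])
    return sigmoid(w[6] * h1 + w[7] * h2 + w[8])

# Fitness: how closely the net reproduces XOR (higher is better).
# This is where the "rules" live -- swap in whatever scoring your task needs.
XOR = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

def fitness(chromosome):
    error = sum((forward(chromosome, a, b) - target) ** 2
                for (a, b), target in XOR)
    return 1.0 / (1.0 + error)

def evolve(pop_size=50, generations=200, mutation_rate=0.1):
    population = [[random.uniform(-2, 2) for _ in range(N_WEIGHTS)]
                  for _ in range(pop_size)]
    for _ in range(generations):
        # Keep the fittest half, then refill with mutated offspring.
        population.sort(key=fitness, reverse=True)
        survivors = population[:pop_size // 2]
        children = []
        while len(survivors) + len(children) < pop_size:
            p1, p2 = random.sample(survivors, 2)
            cut = random.randrange(1, N_WEIGHTS)      # one-point crossover
            child = p1[:cut] + p2[cut:]
            for i in range(N_WEIGHTS):                # point mutation
                if random.random() < mutation_rate:
                    child[i] += random.gauss(0, 0.5)
            children.append(child)
        population = survivors + children
    return max(population, key=fitness)

best = evolve()
print(round(fitness(best), 3))
```

Note that backpropagation never appears: the GA only ever sees the fitness score, which is exactly why this setup works when you can judge the quality of an output but cannot compute a gradient toward the correct one.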
That's pretty much what I remember. I wrote an unmanned vehicle training simulation a while back that was based on this approach. If you're looking for something more mathematical and rigorous, you will need to read some books on the topic. Philosophy and discrete mathematics are two other subjects I would suggest you look into; they can help you build sophisticated rules to govern your training.