Jok:
There is absolutely no need for hundreds of inputs, let alone thousands or whatever craziness you are talking about.
Maybe your understanding of neuroevolution comes from fixed-topology networks, but NEAT is not fixed topology: it only STARTS as a simple perceptron network and branches out from there, and it is capable of evolving recurrent networks as well. With recurrent networks there is no need for an input for every possible variable, because the network can compute those values internally from its own history.
In other words, all those inputs are not necessary!
If you want an example, check out how fast SharpNEAT finds a solution to non-Markovian pole balancing. In the regular Markovian version, the neural nets get inputs for the position and velocity of the cart and the position and velocity of the pole. In the non-Markovian version you don't provide any velocity inputs, so the net has to figure out how to compute them on its own; in SharpNEAT I've seen this happen in 26 generations.
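To make the idea concrete, here is a minimal sketch (not SharpNEAT code, just an illustration) of why one recurrent connection is enough to recover velocity from a position-only input stream. A recurrent node that remembers the previous position can output a finite difference, which is exactly the velocity signal the Markovian version hands the network for free:

```python
def make_velocity_estimator(dt=0.1):
    """Toy stand-in for a recurrent node: holds last position as state
    and outputs the finite-difference velocity (x_t - x_{t-1}) / dt."""
    state = {"prev_pos": None}  # recurrent state carried across time steps

    def step(pos):
        if state["prev_pos"] is None:
            vel = 0.0  # no history yet on the first tick
        else:
            vel = (pos - state["prev_pos"]) / dt
        state["prev_pos"] = pos
        return vel

    return step

# Cart positions sampled every 0.1 s; the "network" never sees velocity.
estimate = make_velocity_estimator(dt=0.1)
positions = [0.0, 0.1, 0.3, 0.6]
velocities = [estimate(p) for p in positions]
```

An evolved NEAT network obviously doesn't compute a clean finite difference like this, but a recurrent loop gives it the memory needed to derive an equivalent signal, which is why the velocity inputs can be dropped entirely.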