Computational simulations of animal brains show that biological networks evolve a hierarchical structure because hierarchies have fewer connections and thus use less energy, so the driver is sparsity rather than efficiency. Researchers hope the findings will improve attempts to create artificial intelligence.

So "efficiency" in the coding sense doesn't mean "energy efficient" or "use the fewest neurons to represent information". Instead, I think what they mean by efficiency here is efficiency in the sense of information theory. In the 50s/60s it was argued that neurons have to code efficiently in the sense of reducing redundancy (maximizing Shannon entropy), so more like how you'd describe a computer network. Basically: how do you maximize the information a neuron's activity can transmit? This works well to describe certain neurons (e.g., the large monopolar cells in fly eyes).
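As a toy illustration of that redundancy-reduction idea (a Python sketch; the exponential input distribution and the 16 response levels are made-up stand-ins, not real data): a neuron with a limited response range transmits the most bits when every response level gets used equally often, which is what matching the response curve to the input statistics (histogram equalization) achieves.

```python
# A toy sketch of redundancy reduction, assuming a made-up exponential
# stimulus distribution and a neuron limited to 16 response levels.
import numpy as np

rng = np.random.default_rng(0)
stimuli = rng.exponential(scale=1.0, size=100_000)  # skewed "natural" inputs
n_levels = 16                                       # limited response range

def entropy_bits(responses):
    """Shannon entropy of the neuron's response distribution."""
    _, counts = np.unique(responses, return_counts=True)
    p = counts / counts.sum()
    return -(p * np.log2(p)).sum()

# Naive code: slice the input range into equal-width response bins.
naive = np.digitize(stimuli, np.linspace(stimuli.min(), stimuli.max(), n_levels))

# Redundancy-reducing code: map inputs through their own empirical CDF
# (histogram equalization), so every response level is used equally often.
edges = np.quantile(stimuli, np.linspace(0, 1, n_levels + 1)[1:-1])
equalized = np.digitize(stimuli, edges)

print(f"naive code:     {entropy_bits(naive):.2f} bits")      # well under 4
print(f"equalized code: {entropy_bits(equalized):.2f} bits")  # ~log2(16) = 4
```

The naive code wastes response levels on inputs that rarely occur, while the equalized code comes out near the 4-bit maximum, which is roughly the sense in which those fly neurons were argued to be efficient.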

Another idea pops up in the 90s called "sparse coding". In this paper they basically use an algorithm to find a set of principal components from natural images. In essence, they found a set of smaller images that they could combine to represent any natural image, and this is their "sparse" code. There's a small, sparse set of building blocks that can be combined in all sorts of different ways to represent all natural images. Now, the set that they found matches almost exactly the receptive fields of neurons in the visual cortex.
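For a feel of what that kind of algorithm does, here's a minimal sketch using scikit-learn's dictionary learner. Random noise patches stand in for natural-image patches here, so the learned building blocks won't look like cortical receptive fields; getting that result requires training on actual natural images.

```python
# A minimal sparse-coding sketch with scikit-learn's dictionary learner.
# The random "patches" below are placeholders for natural-image patches.
import numpy as np
from sklearn.decomposition import MiniBatchDictionaryLearning

rng = np.random.default_rng(0)
patches = rng.standard_normal((1000, 64))       # 1000 fake 8x8 image patches
patches -= patches.mean(axis=1, keepdims=True)  # remove mean luminance

# Learn an overcomplete set of building blocks: 100 components for
# 64-dimensional patches, with an L1 penalty (alpha) that forces each
# patch to be reconstructed from only a handful of them.
learner = MiniBatchDictionaryLearning(
    n_components=100, alpha=1.0,
    transform_algorithm="lasso_lars", random_state=0,
)
codes = learner.fit_transform(patches)  # sparse coefficients, one row per patch
basis = learner.components_             # the learned "building block" images

# The code is sparse: most coefficients are exactly zero.
print("active components per patch:", np.count_nonzero(codes, axis=1).mean())

# Any patch is (approximately) a sparse combination of the basis images.
reconstruction = codes @ basis
```

The alpha penalty is the knob: larger values mean fewer active building blocks per patch at the cost of a rougher reconstruction.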

Sparse coding is attractive because it fits very nicely with how we know the brain works. Especially for vision, there are "stages" of information processing, and more and more areas get recruited to deal with increasingly detailed information. You start with "pixels", so to speak, in the retina, which get combined into circles, which combine to form lines, which combine to form edges, and so on along the visual pathway. Each stage gets input from the last, and different combinations represent different types of information.
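A crude sketch of that stage-by-stage idea, with random untrained filters standing in for real receptive fields: each stage combines all of the previous stage's outputs over a small neighborhood, so the patch of the original image a single unit "sees" grows with depth.

```python
# A crude sketch of hierarchical visual stages with stacked 3x3 filters.
# The filters are random placeholders, not learned or biological ones; the
# point is only that each stage pools the previous one, enlarging receptive
# fields from "pixels" toward more complex, larger-scale features.
import numpy as np
from scipy.signal import convolve2d

rng = np.random.default_rng(0)
image = rng.random((64, 64))  # stand-in for retinal input ("pixels")

def stage(feature_maps, n_filters):
    """Each output map combines ALL input maps through small 3x3 filters,
    then rectifies, mimicking one processing stage feeding the next."""
    outputs = []
    for _ in range(n_filters):
        combined = sum(
            convolve2d(fm, rng.standard_normal((3, 3)), mode="valid")
            for fm in feature_maps
        )
        outputs.append(np.maximum(combined, 0))  # simple rectification
    return outputs

maps = [image]  # stage 0: raw "pixels"
for depth in range(1, 4):  # pixels -> circles -> lines -> edges, loosely
    maps = stage(maps, n_filters=4)
    rf = 2 * depth + 1  # d stacked 3x3 filters see a (2d+1) x (2d+1) patch
    print(f"stage {depth}: {len(maps)} feature maps, "
          f"each unit sees a {rf}x{rf} pixel patch")
```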

So, while sparse coding seems "efficient", it's not necessarily efficient in the sense of reducing informational redundancy (especially since the amount of representation blows up as more and more neurons are used to detect ever more specific kinds of information).
