Tuesday, January 7, 2020

The Future of Perception




Dec 2019, Ars Technica

Graphics cards, backpropagation, and big data – the ingredients for the deep learning revolution.

Our sense of smell is a great example of a deep-learning-style brain module, or to use a fashionable term, a brain ensemble. The processing center of olfaction is the piriform cortex, whose layered structure looks a lot like the stacked-layer diagrams of artificial neural networks. (The name actually comes from the Latin for "pear-shaped", not from those pyramid-like diagrams, though the region is packed with pyramidal neurons.)
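For anyone who hasn't seen one of those diagrams, here is a minimal sketch in NumPy of the kind of layered network I mean. Every number in it is invented for illustration; the 400 inputs just loosely echo the roughly 400 human olfactory receptor types.

import numpy as np

# A toy feedforward network: each layer is narrower than the last,
# which is why the textbook diagrams look vaguely pyramid-shaped.
# All sizes and weights here are illustrative, not a real model.
rng = np.random.default_rng(0)
layer_sizes = [400, 100, 25, 5]

# Random weights and biases stand in for whatever learning would produce.
weights = [rng.standard_normal((m, n)) * 0.1
           for m, n in zip(layer_sizes[:-1], layer_sizes[1:])]
biases = [np.zeros(n) for n in layer_sizes[1:]]

def forward(x):
    """Propagate one input vector through every layer in turn."""
    for W, b in zip(weights, biases):
        x = np.maximum(0.0, x @ W + b)  # ReLU nonlinearity
    return x

# One fake "receptor activation pattern" in, five numbers out.
print(forward(rng.random(400)))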

Furthermore, our olfactory system works a lot like a black box: we don't really know what it's doing in there. The receptors don't code molecules one-to-one; a single odor molecule activates many receptor types, a single receptor type responds to many molecules, and the resulting patterns don't seem to correlate with perception in any meaningful or predictable way.
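To make that "no one-to-one code" point concrete, here is a toy model of combinatorial coding. It's my own illustration rather than anything from the article, and all the numbers are made up.

import numpy as np

# Toy combinatorial code: rows are hypothetical odor molecules,
# columns are receptor types. A 1 means "this receptor responds".
# No single column identifies a molecule; only the whole pattern does.
rng = np.random.default_rng(1)
n_molecules, n_receptors = 5, 12
code = (rng.random((n_molecules, n_receptors)) < 0.4).astype(int)

for i, pattern in enumerate(code):
    print(f"molecule {i}: {pattern}")

# Each receptor responds to several different molecules...
print("molecules per receptor:", code.sum(axis=0))
# ...yet the full activation patterns almost certainly stay distinct.
print("distinct patterns:", len({tuple(p) for p in code}))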

Our autobiographical memory, which is cued and enriched by our olfactory perceptions, is a pretty big dataset. It holds all of our personal memories, along with physiological data such as heart rate, hormone patterns, even sensorimotor data.

Now I'm not sure where the graphics cards come into the picture; something about parallel processing, I guess. Olfaction uses around 400 receptor types while vision gets by with 4 (three cone types and one rod type), so smell is massively parallel right at the input stage. I'm not a computer scientist, but as I understand it, in the context of neural nets a GPU is the opposite of a CPU in roughly the way parallel processing is the opposite of serial processing: the GPU churns through thousands of simple operations at once while the CPU steps through a few complicated ones in order. A rough sketch of that difference follows, though I'll let someone else articulate the full analogy.
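Here is the kind of sketch I mean: the same arithmetic done one element at a time versus in one vectorized call. NumPy's batched operation stands in for the GPU idea; an actual graphics card pushes the same principle much further, with thousands of cores working simultaneously. This comparison is my own, not the article's.

import time
import numpy as np

# The same dot product computed two ways: element by element
# (serial, the CPU style) and as one vectorized call (the batched,
# parallel style that a GPU takes to its extreme).
n = 1_000_000
rng = np.random.default_rng(2)
a, b = rng.random(n), rng.random(n)

t0 = time.perf_counter()
serial = 0.0
for i in range(n):          # one multiply-add per loop pass
    serial += a[i] * b[i]
t1 = time.perf_counter()

vectorized = a @ b          # the whole product in one batched call
t2 = time.perf_counter()

print(f"serial:     {serial:.4f} in {t1 - t0:.4f} s")
print(f"vectorized: {vectorized:.4f} in {t2 - t1:.4f} s")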

Anyway, graphics cards, backpropagating neural networks, and big data are building the "deep" revolution, and I am waiting for the day that olfaction takes its place as the model sense for helping us understand and interact with our omnipotent artificial overlords.
