Thursday, April 8, 2021

Neuromorphic Buzzwords


Recent advances give theoretical insight into why deep learning networks are successful
Aug 2020, phys.org

It's just like olfaction.

If you didn't know what a deep learning neural network was in 2015 when Hidden Scents came out, you do now. Face recognition? Deep learning. Speech recognition? Deep learning. Deep fakes?? You guessed it. 

But why would someone spend an entire chapter of a book on smell talking about brain-like computing systems? Because the little part of our brain that smells is about as close as you get to a deep learning neural network.

And the story goes like this -- Big data brings dirty data, which then brings the curse of dimensionality. It's not like mammals->dogs->poodles. It's like "that dog that bit me one time" and "the kind of dog that likes kids" and "dogs that were selected to hunt rodents" and "coyotes" and "pet cemetery" and "totem poles." Imagine a spreadsheet with as many columns as it has rows. For every rule there's an exception.

What you probably know as a "computer algorithm" is just a bunch of rules. But when every rule has an exception, algorithms don't work so well anymore. This is the curse of dimensionality.
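Here's a minimal sketch of the curse in action -- plain numpy, nothing from the paper, just a standard demonstration. Scatter random points in more and more dimensions and watch the gap between your nearest neighbor and your farthest neighbor collapse. Once everything is roughly equidistant, a distance-threshold rule has nothing left to grab onto.

    import numpy as np

    rng = np.random.default_rng(0)

    # Scatter 1,000 random points in d dimensions, then compare the
    # nearest and farthest neighbor of one query point. As d grows,
    # the ratio climbs toward 1 -- "near" and "far" stop meaning
    # anything, so distance-based rules stop working.
    for d in (2, 10, 100, 1000):
        points = rng.random((1000, d))
        query = rng.random(d)
        dists = np.linalg.norm(points - query, axis=1)
        print(f"d={d:5d}  nearest/farthest ratio: {dists.min() / dists.max():.3f}")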

This is also the chemosphere being described. Chemicals are myriad and ever-changing. Any means of chemosensation will have to employ something closer to a deep learning network than to an old-fashioned computer algorithm of IF/THEN functions. And that's why our olfactory system could really be called the deep nose, and why olfaction will become the representative sense of the Age of Approximation born of the datapocalypse. 
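For contrast, here's roughly what the network-style alternative looks like -- a hedged sketch, not anyone's real model. The weights below are random stand-ins for what a trained system would learn, and the "odorant features" are invented for illustration. The point is the shape of the computation: no IF/THEN rules anywhere, just layers of weighted sums squashed through thresholds.

    import numpy as np

    rng = np.random.default_rng(1)

    # An invented "odorant": 64 made-up chemical features. A real
    # system would use measured molecular descriptors.
    odorant = rng.random(64)

    # Random weights stand in for what training would produce; the
    # point is the structure, not the numbers.
    W1, b1 = 0.1 * rng.normal(size=(128, 64)), np.zeros(128)
    W2, b2 = 0.1 * rng.normal(size=(8, 128)), np.zeros(8)

    hidden = np.maximum(0.0, W1 @ odorant + b1)    # layer 1: weighted sum + threshold (ReLU)
    scores = W2 @ hidden + b2                      # layer 2: one score per odor category
    probs = np.exp(scores) / np.exp(scores).sum()  # softmax: scores -> probabilities
    print("odor category probabilities:", np.round(probs, 3))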

This thought-provoking paper does a much better job of describing these networks, and it draws out implications for their use in society:

Tomaso Poggio et al. Theoretical issues in deep networks, Proceedings of the National Academy of Sciences (2020). DOI: 10.1073/pnas.1907369117
