Thursday, August 31, 2017

Olfaction Meets AI

Headline reads like this:

Aug 2017, BBC

And inside:

Nigerian Oshi Agabi’s modem-sized device - dubbed Koniku Kore - could provide the brain for future robots. It is an amalgam of living neurons and silicon, with olfactory capabilities — basically sensors that can detect and recognise smells.

And an explanation:

While computers are better than humans at complex mathematical equations, there are many cognitive functions where the brain is much better: training a computer to recognise smells would require colossal amounts of computational power and energy, for example.

The prototype device shown off at TED - the pictures of which cannot yet be publicly revealed - has partially solved one of the biggest challenges of harnessing biological systems - keeping the neurons alive. "This device can live on a desk and we can keep them alive for a couple of months," Agabi told the BBC.

And what do we think about this?

As much as this story is pretty nuts (if the sentence “They can live on a desk” doesn’t make your head spin…), it’s an all-too-common story in the tech world. Not that it’s fake news or anything, but let’s just say it’s misleading to talk about “smelling robots” in this way.

The less interesting truth is that these devices can only be trained to detect specific molecules, not signatures, or combinations, of molecules. A system able to smell “anything that might come up,” and to use that information for something important, could not be trained this way. Then again, we humans get trained to do exactly this from birth; in fact, we are already learning about our olfactory environment in utero.
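To make that limitation concrete, here is a purely illustrative sketch in Python. The molecule names, the three-channel “sensor signatures,” and the threshold are all invented for the example, not taken from the Koniku device. The point is just that a system trained on a fixed list of target signatures can label readings near those targets, but has no principled answer for a mixture it was never trained on.

```python
# Toy "smell classifier" over made-up sensor readings (illustration only).
# Each trained molecule is represented by a fake 3-channel signature.
TRAINED_SIGNATURES = {
    "acetone":  (0.9, 0.1, 0.0),
    "ethanol":  (0.2, 0.8, 0.1),
    "limonene": (0.1, 0.2, 0.9),
}

def classify(reading, threshold=0.15):
    """Return the closest trained molecule, or None if nothing is close.

    Distance is plain Euclidean distance between the reading and each
    stored signature -- the toy stand-in for "recognising" a smell.
    """
    best_name, best_dist = None, float("inf")
    for name, sig in TRAINED_SIGNATURES.items():
        dist = sum((r - s) ** 2 for r, s in zip(reading, sig)) ** 0.5
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name if best_dist <= threshold else None

# A reading near a trained signature is recognised...
print(classify((0.85, 0.12, 0.05)))   # acetone
# ...but a mixture of two trained molecules falls outside every
# signature's neighbourhood, and the system has nothing to say.
print(classify((0.55, 0.45, 0.05)))   # None
```

Real electronic noses are more sophisticated than a nearest-signature lookup, but they share the same basic constraint: the space of things they can recognize is fixed at training time.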

So if we want AI to meet olfaction, what we need to do is keep these devices alive for a lifetime, and give them a body, and friends, and a job. You know, just like a real person. They would need to learn from the ground up, just like a real person.

However ---

There is a point being made here by Mr. Agabi that is totally in line with the thesis of Hidden Scents. The way we use computers today will eventually be supplanted by something else. Traditional computation will still be useful, but something else will take us beyond the capacities of today’s technology (there’s a whole lot of talk in the sci-fi sphere about quantum computing, for example).

As of now, neural networks are taking us in a new direction. Granted, they were used back in the ’80s, but only recently have they brought a marked change in computing technique. (I like to note here the contemporaneous link between neural networks and bitcoin mining: in both, the processor is no longer the key component – it’s how many graphics cards you have all wired together.)

From an information-processing point of view, the olfactory bulb, the crux of the olfactory system, is a model neural network. And the fact that it’s already connected to the limbic system – the thing that makes us move, the thing that makes our bodies work, and even our emotions – makes it a model system for so much more.

*Anyone with more comp-sci knowledge than me, please feel free to correct this; I am no expert and am speaking in pretty broad, possibly misunderstood, terms.
