Saturday, June 30, 2018

Stand Corrected on Smelling Robots



It’s already happening, in Edinburgh: robot noses are taking our jobs – doctors’ jobs, that is. We already know that dogs can tell when you’re sick just by the way you smell. And maybe fewer of us know that dogs can smell the place on your body where the sickness comes from, like if you have some kind of cancer hiding inside you. Alexandra Horowitz went into great detail about that kind of magic in her book on dogs’ sense of smell.

It’s different now, however, because these aren’t dogs but computers. To get a bit more specific, it’s a gas-sniffing machine (a GC-MS – gas chromatograph–mass spectrometer, gulp – the de facto artificial smelling machine) combined with a special kind of ‘computer’ called a neural network.

If you’ve ever read my book or my blog, or you haven’t been under a rock for the past five years, you’ve heard of neural nets. They are these magical new* ways of computing that created Google’s DeepDream and AlphaGo and every other headline where a computer did something we never thought a computer could do (like dream and make art, yes). And now they smell.

But not really; we’ll get to that. First, it’s important to point out that this news comes from Nvidia, which makes GPUs, not CPUs. The computers we use, and have used forever, run on CPUs – that’s the way it’s always been. Then the part of the computer that handles graphics, the GPU, started doing more and more of the general computing. We heard about GPUs first in regard to video games, but then because of Bitcoin, since miners use tons of interconnected GPUs to do their mining (and yes, all those gamers got pissed because the price of GPUs exploded in tandem with the cryptocurrency bubble).

GPUs do more than provide smooth, clear graphics for your video games or authenticated cryptocurrency for your third-world black-market terrorist-druglord network. They make a computer more like a brain – hence the term artificial neural network.


Brains are all interconnected – neurons and axons, hub and spoke. Neural nets, with their GPU ‘neurons,’ approximate a brain better NOT because of better algorithms or software, but because of better hardware. And with all this, we’re seeing artificial intelligence explode – I hate to say it – but it’s happening just like Ray Kurzweil said it would.
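To see why the hardware matters so much, here’s a minimal sketch (sizes and numbers are made up, purely for illustration): one neural-net ‘layer’ is, at bottom, a single big matrix multiplication – hundreds of thousands of independent multiply-adds, which is exactly the kind of massively parallel arithmetic a GPU was built to do all at once, while a CPU mostly grinds through them one after another.

```python
import numpy as np

rng = np.random.default_rng(0)
inputs = rng.random(1000)            # 1000 input "neurons"
weights = rng.random((500, 1000))    # every output connected to every input

# 500,000 multiply-adds, all independent of one another -- a GPU can run
# them in parallel; here NumPy just does them on the CPU.
outputs = weights @ inputs
print(outputs.shape)  # (500,)
```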

So after beating a human at Go, after successfully debating a human on the benefits of space travel to humanity, after creating its own language that humans can’t even understand, after detecting health abnormalities in patients’ X-rays better than doctors, and after playing rock-paper-scissors so well that it seems to predict what we will throw before we throw it and hence beats us every single time – the damn thing now smells. (The rock-paper-scissors example is simply processing speed – the system sees our hands about to make a shape and counters so fast that to us it seems like it happened ‘at the same time.’)

But let’s not get ahead of ourselves here. The first thing to note is that this thing is not smelling. It’s been trained to recognize a very small subset of molecules related to cancer. Whereas humans can detect just about any volatile organic molecule (rough definition), this thing can only detect what we’ve trained it to detect.** And this is not the first time a system has been trained to smell – it happens a lot with bomb sniffing, for example, and artificially augmented bomb-sniffing remote-control cicadas are also real. Anyway, here is where I have to geek the F out: the part of us that smells IS a neural net.
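The ‘only what we’ve trained it to detect’ point is easy to show in miniature. This toy sketch (not the actual Edinburgh system – the signatures and labels are invented) classifies a sample by its nearest trained ‘molecule signature.’ Note what happens with an input unlike anything it was trained on: it still gets shoehorned into a known class, because a trained system has no concept of anything outside its training set.

```python
def nearest_class(sample, centroids):
    """Return the label of the closest trained signature (Euclidean distance)."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return min(centroids, key=lambda label: dist(sample, centroids[label]))

# Hypothetical training signatures: three numbers standing in for a spectrum.
trained = {
    "marker_A": [0.9, 0.1, 0.0],
    "marker_B": [0.1, 0.9, 0.1],
}

print(nearest_class([0.8, 0.2, 0.1], trained))  # close to marker_A -> "marker_A"
# A molecule unlike anything it was trained on still gets a trained label:
print(nearest_class([0.0, 0.0, 0.9], trained))  # forced to "marker_B"
```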

Granted, our whole brain is like a neural net (yes, hence the words ‘neural net’). But the part of our brain that specifically takes the electrical signals from molecular contact and organizes them into perception – the piriform cortex (from ‘pear-shaped,’ also known as the olfactory cortex) – is a funnel-structured network where hundreds of receptor signals are whittled down to a few output fibers. And if you’ve ever seen a picture of a neural net, well, it’s the same thing.
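That funnel shape can be sketched in a few lines. The layer sizes below are invented, chosen only to mirror the hundreds-down-to-a-few narrowing described above; the weights are random, so this is the shape of the network, not a trained nose.

```python
import numpy as np

rng = np.random.default_rng(1)
sizes = [400, 50, 5]               # receptors -> intermediate layer -> outputs
signal = rng.random(sizes[0])      # pretend receptor activations

for n_in, n_out in zip(sizes, sizes[1:]):
    w = rng.random((n_out, n_in)) / n_in   # random weights, scaled small
    signal = np.maximum(w @ signal, 0.0)   # linear mix + simple nonlinearity

print(signal.shape)  # (5,) -- 400 receptor signals whittled down to 5
```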

This is one of the underlying themes of my book, and one of the reasons I was compelled to write it. Our sense of smell, the most under-studied of all the senses, actually works like the most advanced technology there is right now: artificially intelligent, brain-like systems. I like to call them intelligentities (which is gender-neutral, btw, and also neutral on some other thing we aren’t even upset about yet, where we make a biased distinction between humans and computers).

Although it seems like we’re making serious progress in this area, I still assert that studying olfaction is an ideal way to optimize these kinds of systems. Until then, you can rest assured that although these things can already do basically everything better than you, they still can’t smell. (And many of us will have to wonder – is that a bad thing? That is, once we have the option, will humans in the distant future still want to smell?)

*Marvin Minsky and others were experimenting with neural nets as far back as the 1950s, but the hardware wasn’t there yet to make them sing.

**Artificial intelligence can only do what we train it to do. This is a major source of the inherent biases that show up in these programs, and the reason we need to do a better job of choosing their training data and then testing these programs to see whether they discriminate, and against whom. Search this phrase to find out more: ‘man is to computer programmer as woman is to homemaker.’

Notes:
Image source: Olfactory Bulb (aka non-artificial neural network)

Article source:
June 2018, nvidia.com
