Friday, February 24, 2017

Almost Truth

When you search ‘breath of fresh air’ and every picture has people with their arms open wide. (What’s up with that?)

Dec 2016, phys.org

There seems to be this debate, or perhaps I should just call it confusion, over whether or not we can smell non-organic molecules like ammonia, chlorine, or sulfur. From what I can get out of people who are professionals in chemistry, smell science, or what have you – we cannot smell these things.

When we smell the ‘chlorine in the pool,’ we are actually smelling chlorine as it mixes with other organic molecules to make chloramines (and so the smell of chlorine, which most would consider clean and disinfected, is actually the smell of a dirty pool, because the cleaning agent chlorine is mixing with all the organic garbage poop molecules in the pool). “Sulfur” is the smell of sulfur mixed with other organic molecules. Some people say we can smell ammonia, but I bet it’s the same situation.

While we’re talking about it, “metal” is not the smell of metal but the smell of something, an organic something (like our sweaty hands), interacting with the metal.* (I have a smell in my vocabulary called ‘metal mold’ and although I’m not sure what it is, its smell is powerful and unmistakable...and it's on my fire escape sometimes.)

So when I hear this – "mice can smell oxygen" – I have a feeling it’s not as it seems. And sure enough the truth reads like this:

They don’t smell oxygen itself, but the “levels of oxygen in the air.” They also don’t use odor receptor genes to do this; the genes involved are chemosensitive, but they are not odor receptor genes. And in humans these genes are non-functional (called pseudogenes, or junk genes), so we can’t generalize this to humans, only mice.

Can we say that mice “smell” oxygen? That’s like almost the truth, almost a fact. They can sense it. And if we consider chemosensation to fall under olfaction, just for simplicity’s sake, then sure, they can. That’s how almost truth works, isn’t it? And for the record, this is one of the reasons Hidden Scents is subtitled …’the age of approximation.’ The very thing that is so commonplace in studying, or simply experiencing, the world of olfaction is fast becoming the norm in how we interact with the Noosphere, the total collection of facts and knowledge.

*Credit to author Alexandra Horowitz; I got this from her book Being a Dog, it just came out in 2016, and is absolutely fascinating.


Wednesday, February 22, 2017

Outer Space Makes its Own Tastes

Suntory Time, Lost in Translation 2003

Here at Limbic Signal, we don’t always talk about taste (but when we do, we copy it directly).

[Also note, this is a repost from my previous blog, so it’s old news. Too good not to bring it up again though.]

Although this recent article on space whiskey doesn’t even mention Suntory, the Japanese distillery of Bill Murray lore sent its own sample into outer space as well, and we still await their results.

Scottish distillery Ardbeg has delivered the results of its own space-aged whiskey. In keeping with the pursuit of an omnipotent odor lexicon, the descriptions here are taken directly from the article:


Earth sample: "The sample had a woody aroma, reminiscent of an aged Ardbeg style, with hints of cedar, sweet smoke and aged balsamic vinegar, as well as raisins, treacle toffee, vanilla and burnt oranges.
"On the palate, its woody, balsamic flavours shone through, along with a distant fruitiness, some charcoal and antiseptic notes, leading to a long, lingering aftertaste, with flavours of gentle smoke, tar and creamy fudge."

Space sample: "Its intense aroma had hints of antiseptic smoke, rubber and smoked fish, along with a curious, perfumed note, like violet or cassis, and powerful woody tones, leading to a meaty aroma.

"The taste was very focused, with smoked fruits such as prunes, raisins, sugared plums and cherries, earthy peat smoke, peppermint, aniseed, cinnamon and smoked bacon or hickory-smoked ham. The aftertaste is intense and long, with hints of wood, antiseptic lozenges and rubbery smoke."

Language Models



Yann LeCun, in this talk for the Wired Business Conference 2016, titled AI Arms Race, notes how the best natural language models are now deep learning based. At Limbic Signal we find this funny because our sense of smell works akin to the deep learning model, and yet the language of smell is an unwieldy concept. Alas, it is unwieldy because we measure wieldiness by classical means. We have entered the age of approximation, however.

Non Traditional Computing, Complex Problems, and Approximation

Illustration for Death of a Salesman by Brian Stauffer for the Soulpepper Theater Company, Toronto

It may seem like a stretch to write about combinatorial optimization problems (aka the traveling salesman problem) on a blog about the ‘language of smell,’ but Limbic Signal isn’t just about smells, or language, but the connections between olfaction and computation. Our olfactory system is a champion at dealing with very large, very complex datasets.

Olfaction uses our brain in ways the other senses don’t. Some of the ways olfaction diverges from the other senses are akin to novel solutions to very complex problems in computation, such as big-data-sifting, pattern recognition, or the aforementioned traveling salesman problem.

Also note that, in addition to the magnet network described below, another unconventional solution to the traveling salesman problem is to use mold. In fact, slime mold has been used to recreate the layout of Spain's motorways and the Tokyo rail system.

So this article below does a good job of explaining the traveling salesman problem; I straight copied it from the writers at phys.org. And in the second section is an explanation of an interesting solution to the problem.

Researchers create a new type of computer that can solve problems that are a challenge for traditional computers

The traveling salesman problem
There is a special type of problem - called a combinatorial optimization problem - that traditional computers find difficult to solve, even approximately. An example is what's known as the "traveling salesman" problem, wherein a salesman has to visit a specific set of cities, each only once, and return to the first city, and the salesman wants to take the most efficient route possible. This problem may seem simple but the number of possible routes increases extremely rapidly as cities are added, and this underlies why the problem is difficult to solve.
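
To make that combinatorial explosion concrete, here is a minimal brute-force sketch; the city names and coordinates are invented for illustration. It checks every possible tour, which is fine for five cities and hopeless for fifty:

```python
# Brute-force enumeration of every tour, to show why the problem blows up.
# The cities and coordinates below are made up for illustration.
import itertools
import math

cities = {"A": (0, 0), "B": (1, 5), "C": (4, 3), "D": (6, 1), "E": (2, 2)}

def tour_length(tour):
    # Total distance of a closed route that returns to the starting city.
    legs = zip(tour, tour[1:] + tour[:1])
    return sum(math.dist(cities[a], cities[b]) for a, b in legs)

start = "A"
rest = [c for c in cities if c != start]
best = min(([start] + list(p) for p in itertools.permutations(rest)), key=tour_length)
print("best tour:", best, "length:", round(tour_length(best), 2))

# The number of distinct tours grows factorially: (n - 1)! / 2
for n in (5, 10, 15, 20):
    print(f"{n} cities -> {math.factorial(n - 1) // 2:,} tours")
```

Twenty cities already give roughly sixty quadrillion possible tours, which is why the problem "may seem simple" but isn't.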

...
It may be tempting to simply give up on the traveling salesman, but solving such hard optimization problems could have enormous impact in a wide range of areas. Examples include finding the optimal path for delivery trucks, minimizing interference in wireless networks, and determining how proteins fold. Even small improvements in some of these areas could result in massive monetary savings, which is why some scientists have spent their careers creating algorithms that produce very good approximate solutions to this type of problem.

An Ising machine
The Stanford team has built what's called an Ising machine, named for a mathematical model of magnetism. The machine acts like a reprogrammable network of artificial magnets where each magnet only points up or down and, like a real magnetic system, it is expected to tend toward operating at low energy.

The theory is that, if the connections among a network of magnets can be programmed to represent the problem at hand, once they settle on the optimal, low-energy directions they should face, the solution can be derived from their final state. In the case of the traveling salesman, each artificial magnet in the Ising machine represents the position of a city in a particular path.
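
Here is a rough software caricature of that idea, not the Stanford machine itself: a handful of up/down “magnets” with arbitrary couplings, relaxed toward low energy with simulated annealing. The couplings, temperature, and step count below are invented for illustration; in a real Ising machine they would be programmed to encode the cities and distances of the problem at hand.

```python
# A software caricature of the Ising idea: spins are +1/-1, the couplings J
# encode the problem, and the network relaxes toward low energy.
import math
import random

random.seed(0)
n = 8
J = [[random.choice([-1, 1]) if i != j else 0 for j in range(n)] for i in range(n)]

def energy(spins):
    # Ising energy: E = -sum over pairs of J[i][j] * s_i * s_j
    return -sum(J[i][j] * spins[i] * spins[j] for i in range(n) for j in range(i + 1, n))

spins = [random.choice([-1, 1]) for _ in range(n)]
T = 5.0
for step in range(5000):
    i = random.randrange(n)
    trial = spins[:]
    trial[i] *= -1                      # flip one magnet
    dE = energy(trial) - energy(spins)
    if dE < 0 or random.random() < math.exp(-dE / T):
        spins = trial                   # accept downhill moves, and sometimes uphill ones
    T = max(0.01, T * 0.999)            # cool slowly so the network settles

print("final spins:", spins, "energy:", energy(spins))
```

The final pattern of ups and downs is the machine's answer; reading the solution back out is just a matter of how the problem was encoded into the couplings in the first place.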



Wednesday, February 15, 2017

Human Evolution in Action

AKA Anosmia of Androstenone


Humans are not done evolving. The most readily available evidence for this is the vast difference in our abilities to smell. Of the 400 types of olfactory receptors used to code all smellable molecules, humans show sizable differences in the way they use their olfactory switchboard. For example, specific anosmia, or odor-blindness to one particular smell, is very common.

Chances are that one of every two people reading this is anosmic to something. (I would suggest buying the board game P.U. The Guessing Game of Smells, sniffing through the scented cards, and noting which one doesn’t smell bad at all – that’s the one you’re anosmic to.)

In a recent report, scientists are busy decoding these differences in our genetic artifacts and using them to chart the distribution of cultural forces on genetic evolution. If you can’t smell androstenone, your OR7D4 gene is turned off. But in the on position, this gene can be coded to “perceive” androstenone as either sweet and floral, or sweaty and urinous.

Androstenone, a derivative of testosterone, is found in humans and especially in male pigs. In fact, to those who find androstenone unpleasant, uncastrated boar meat tastes gross. Many of these people are from Africa. And many of those who can’t smell it at all are from the Northern Hemisphere. Note that the domestication of pigs began in Asia. In a culture where pig is often on the menu, it would not be a good thing for you to smell your dinner as piss and sweat. Over time and across generations, this genetic variant would be deselected.

Here’s another interesting aromatic molecule – isovaleric acid. It is the smell of vomit, sweaty feet, and Parmesan cheese. With no clues given, most people describe it as either disgusting or delicious, in a totally unpredictable 50/50 distribution. Even the same people, at different times, will report different answers.

In this case, it isn’t yet a matter of genetics. Instead, the brain is evolving in real-time, vacillating, feeling, “thinking” about how to perceive this thing.

Follow me here – animal husbandry begins before dairying practices, and after that, cheese-making. There are certain sequences in cultural evolution, and this is one of them. Is it safe to say, then, that in some quantity of years, in generations to come, Parmesan-eating populations will lose their ability to smell isovaleric acid altogether?

Or one day even further in our future, when our cultures have become so far removed from our genetic history, will we stop smelling everything?


Tuesday, February 14, 2017

Fecal Matters


Look, I know this article is a bit old, but, come on – do intros like this really ever go bad??

Looking out for No. 2: Dogs sniff out fecal pollution at Jersey Shore

Can't do better than this intro:
"Some specially trained dogs are helping humans curb themselves.

"A company that has trained dogs to recognize the smell of human fecal bacteria has been sniffing out sources of water pollution nationwide, discovering broken sewer pipes, leaking septic tanks and illegal sewage discharges, to the delight of environmental groups and government agencies.

...
"The dogs are trained to ignore fecal bacteria from animals, and customers often try to trick them (unsuccessfully) during evaluations of the company. Waterway contamination from bird, wild animal and pet waste is also a significant source of water pollution.

...
"After the dogs found an upstream source of sewage from a faulty septic tank, repairs were made, and the beach saw much fewer closures, Haley said."

Wednesday, February 8, 2017

The Smells of Expansion and Contraction

wrinkled tesseracts
In the contemporary parlance of computer security, this is called a "zero-day exploit": Olfaction is the back door to our un-thinking mind, and big business knows this.

Smells can represent spatiality to us and unconsciously direct our decision-making. Retailers can use this to their advantage, with clever olfactory design of sales spaces that "tricks" our emotionally reined limbic system.

Using a scent reminiscent of enclosed spaces, like the smell of firewood, and a scent evoking open spaces, like the seashore, scientists found a benefit to augmenting retail spaces with smells that complement them.

It is interesting to think that space itself can be distinguished by its smell, but on second thought, of course it can. Half of smell is place. That is to say, the hippocampus, or "place" part of the brain, plays a critical role in smell/memory.

Closed spaces and open spaces both affect us psychologically, and the co-located smells of those places would do so as well.

Thursday, February 2, 2017

Bad Information

I'm not sure exactly how they got this image, but it sure looks like it came from the Google Deep Dream project, where a deep learning network was asked to 'dream' about images and produce 'overperceived' images, which look a lot like hallucinating on psychoactive mycotoxins.

Is there such a thing as Bad Information? If so, what is the difference between Good and Bad? How do we know that difference?

Artificial intelligence, and information theory in general, is a common theme in Hidden Scents. How can you not write about it these days? We are computers. At least, we are becoming computers. Or they us. At least, that's what the analogies we use to make sense of our world say. Do we know anything aside from the analogies we use? (We should probably be asking Douglas Hofstadter about that one.)

Back when pneumatics was the technology du jour, we thought the nervous system worked according to pressure in the nerves. That was correct for the circulatory system, but the utility of that analogy ended there. Eventually, the computer analogy will run out, but until then, we are computers. And these days, specifically, we are computers learning to recognize patterns in our environment using feed-forward, backpropagated layers of feature detection.

This brings us to the premiere of a new infotech textbook.

“The Deep Learning textbook is a resource intended to help students and practitioners enter the field of machine learning in general and deep learning in particular.”

Deep Learning, An MIT Press book
Ian Goodfellow and Yoshua Bengio and Aaron Courville, 2016

It's a textbook, so it's too technical for the interested layperson. But there is some good introductory material that could help straighten things out for people who want to know what it is, but don't have the context-specific knowledge to digest the whole thing.


Here's from the chapter on Information Theory:

“Likely events should have low information content, and in the extreme case, events that are guaranteed to happen should have no information content whatsoever.

“Less likely events should have higher information content.

“Independent events should have additive information. For example, finding out that a tossed coin has come up as heads twice should convey twice as much information as finding out that a tossed coin has come up as heads once.”

The text then goes on to translate these maxims into mathematical formulae.
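
The central formula is short enough to sketch here: self-information is I(x) = -log2 P(x), measured in bits, and it satisfies all three maxims above. Below is the coin example from the quote; the probabilities are the only inputs, the rest is arithmetic.

```python
# Self-information I(x) = -log2 P(x), in bits: certain events carry zero
# bits, rarer events carry more, and independent events add up.
import math

def self_information(p):
    return math.log2(1 / p)

print(self_information(1.0))   # 0.0 bits: a guaranteed event, no information
print(self_information(0.5))   # 1.0 bit: one fair coin landing heads
print(self_information(0.25))  # 2.0 bits: two independent heads, additive
```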


Sometimes someone says something and I'm like, wow, that was really stupid. But then later on, when I try to think about -why- it was stupid, I find it difficult to articulate. Above we have a good rationale for explaining why a particular statement is 'stupid' or not. It depends on how much information it has, and this is how we measure that information. In layman's terms, we would call this the Captain Obvious principle. If you just said something that everyone already knows or should expect, but you said it like it's got good information value (as if nobody knows or expects it), then that would come across as stupid.

There we go again, turning a branch of applied mathematics into a magnifying glass for human behavior; probably not what the authors of this text intended to be done with their work.

Anyway, I like the word hard-coding. They use it to describe the 'older' way of writing knowledge about the world directly into a program (instead of 'letting the program figure it out for itself,' as these newer deep learning programs do).

They point out in the introduction that "A person's everyday life requires an immense amount of knowledge about the world. Much of this knowledge is subjective and intuitive, and therefore difficult to articulate in a formal way. Computers need to capture this same knowledge in order to behave in an intelligent way. One of the key challenges in artificial intelligence is how to get this informal knowledge into a computer." When computers instead acquire their own knowledge, by extracting patterns from raw data, that is known as machine learning. Deep learning is a type of machine learning.
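
As a toy contrast, and only a toy (the task, the examples, and the 0.5 threshold below are all invented, and real deep learning learns far richer features than a single number), here is the difference between hard-coding a piece of knowledge and extracting it from data:

```python
# A toy contrast between hard-coding knowledge and extracting it from data.
# The task ("is this text shouting?"), the examples, and the 0.5 threshold
# are invented for illustration.

def uppercase_ratio(text):
    letters = [c for c in text if c.isalpha()]
    return sum(c.isupper() for c in letters) / max(len(letters), 1)

# Hard-coded: the programmer writes the knowledge directly into the program.
def is_shouting_hardcoded(text):
    return uppercase_ratio(text) > 0.5   # 0.5 chosen by a human, by hand

# Learned: the same kind of rule, but the cutoff is extracted from labeled examples.
examples = [("hello there", 0), ("HELLO THERE", 1), ("Fine, thanks", 0), ("STOP NOW", 1)]

def learn_threshold(data):
    # Try each observed ratio as a candidate cutoff; keep the one that
    # makes the fewest mistakes on the labeled examples.
    candidates = sorted(uppercase_ratio(t) for t, _ in data)
    mistakes = lambda th: sum((uppercase_ratio(t) >= th) != bool(y) for t, y in data)
    return min(candidates, key=mistakes)

threshold = learn_threshold(examples)
print("learned threshold:", threshold)
print("hard-coded:", is_shouting_hardcoded("STOP NOW"))
print("learned:   ", uppercase_ratio("STOP NOW") >= threshold)
```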

Still, figuring out which details are valuable and which are inconsequential is the hardest part. -Disentangling- is a word emphasized by the authors. That's a favorite word in Hidden Scents as well. So is inextricable, the information-opposite of disentangle. So is disambiguate, the big brother of disentangle.

If you're into this stuff, and a bit more on the application side than the theoretical side, you might want to check this book out. And if you're just into machine-generated hallucinations, or if you've ever tripped on psilocybin mushrooms and want to see something reminiscent - very reminiscent - unnervingly reminiscent - check out the front cover.


notes:
Analogy as the Core of Cognition, Douglas Hofstadter, Stanford lecture, 2006

Wednesday, February 1, 2017

White Smell Machine

[compulsory image]

Another piece of news sitting in my to-post box for way too long:

Brothers Lav Varshney, with the University of Illinois, and Kush Varshney, with IBM's Thomas J. Watson Research Center, have taken the idea of White Smell further by creating a mathematical model that could be used to build a working 'olfactory white machine,' which, by using an olfactory analog to White Noise, could render a smell unsmellable.


Their white smell machine could be used to improve indoor air quality, or by the food industry to cancel bad smells.
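
I don't have the Varshneys' actual model in front of me, so here is only a toy illustration of the olfactory-white idea their work builds on: mix enough diverse components and every large mixture starts to resemble every other one. The odorants below are just random feature vectors, and the dimensionality and mixture sizes are made up.

```python
# Toy illustration of "olfactory white": large, diverse mixtures converge
# toward one another. Odorants are random feature vectors; the dimensions
# and mixture sizes are arbitrary.
import math
import random

random.seed(1)
DIM = 40  # pretend each odorant is a point in a 40-dimensional feature space

def random_odorant():
    return [random.random() for _ in range(DIM)]

def mixture(odorants):
    # Componentwise average of the odorant vectors in the blend.
    return [sum(v[i] for v in odorants) / len(odorants) for i in range(DIM)]

def similarity(a, b):
    # Cosine similarity: 1.0 means the two mixtures look identical here.
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.hypot(*a) * math.hypot(*b))

for size in (1, 5, 30, 100):
    m1 = mixture([random_odorant() for _ in range(size)])
    m2 = mixture([random_odorant() for _ in range(size)])
    print(f"{size:3d} components -> similarity {similarity(m1, m2):.3f}")
```

The trend in the printout, with similarity creeping toward 1.0 as the blends get bigger, is the 'white' part: past a few dozen diverse components, the differences wash out.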