People interact with digitized sensory capabilities every day, though not across the full range of senses: “sight” (cameras, lidar, radar), “hearing” (microphones), and “touch” (prosthetics with feedback implants). Smell has lagged behind, which makes the AI nose the vanguard.
The good news is that Aryballe, a Grenoble, France-based digital olfaction firm, is on it, using biochemical sensors, optics, and machine learning to detect odors and turn them into data.
It is, however, a profoundly complex task. Humans can distinguish more than one trillion scents, according to a 2014 study by the U.S. National Institutes of Health published in the journal Science.
But Aryballe is going beyond creating a digital nose: it is digitizing scents themselves – a “Shazam for smells,” said Sam Guilaumé, the company’s chief executive. Smell libraries of a sort already exist in France, Germany, Scotland, and the United States.
But pairing AI-enhanced detection with a library is new. “Ones and zeros are just data. I like to think we turn smell into knowledge,” Guilaumé said. “We essentially make olfaction quantifiable and objective, and enable someone to make a decision based on our information.”
Why digitize the ability to smell?
Founded in 2014, Aryballe has operations in France and America, and clients in the food and beverage, consumer packaged goods, and automotive industries.
With a patient receiving the first fully digital prosthetic eye in 2021, the idea of assisting a person with total or partial loss of smell — anosmia and hyposmia, respectively — with a digital olfaction implant is plausible. It’s the purpose of the European Union-funded Rose project, in which Aryballe is a lead participant.
The project aims to help the 20 percent of the world’s population with some form of smell loss.
“The objective is to explore how the Aryballe sensor could help link artificial systems to human biological olfaction, and enable an anosmia patient to recall their sense of smell,” explained Guilaumé.
It matters because the tiny molecules that make up smells alert us to danger, evoke memories, trigger emotions, help with attraction, excite other senses such as taste, and even enhance the experience of sight and sound.
But AI-powered noses might have some less-obvious uses. Guilaumé talked about how an autonomous vehicle fleet operator, for example, would not have a driver to keep passengers from smoking, or eating, say, tuna. “To maintain a level of cleanliness, you need a device such as ours that can monitor odors in the cabin.”
Why is it so hard to achieve?
Humans sense smell when an object emits odor molecules that are carried through the air into our noses as we breathe. As these tiny molecules pass across specialized nerve cells in the nose, they activate olfactory neurons, which identify smells, triggering memories and evoking emotions. A human nose has several million such cells, which between them express the 400 or so receptor types that can identify specific odor molecules.
Aryballe uses organic chemistry that mimics the nose, explained Guilaumé. Volatile odor compounds bind with receptor-mimicking biosensors on the company’s digital sensors, creating a kind of “smell image.”
The image is matched against Aryballe’s knowledge base, and machine learning then makes a recommendation about the odor’s nature. The whole process takes 10 to 20 seconds, which is about as long as it takes humans to react to most smells.
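Aryballe has not published the details of its matching algorithm, but the pipeline described above can be imagined as reducing each odor to a vector of sensor responses and classifying a new reading by its similarity to entries in a reference library. The following is a purely illustrative sketch; the library names, vectors, and the choice of cosine similarity are all assumptions for demonstration, not Aryballe’s method:

```python
import math

# Hypothetical smell "signatures": each odor reduced to a small vector of
# sensor responses. The names and numbers are invented for illustration.
LIBRARY = {
    "vanilla":  [0.9, 0.1, 0.4, 0.0],
    "vanillin": [0.8, 0.2, 0.5, 0.1],
    "tuna":     [0.1, 0.9, 0.2, 0.7],
}

def cosine(a, b):
    """Cosine similarity between two sensor-response vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def classify(signature):
    """Return the library odor whose signature best matches the reading."""
    return max(LIBRARY, key=lambda name: cosine(LIBRARY[name], signature))

reading = [0.9, 0.12, 0.38, 0.0]  # a fresh (hypothetical) sensor reading
print(classify(reading))          # prints "vanilla"
```

A real system would work with far higher-dimensional signatures, many samples per odor, and a trained model rather than a single nearest match, but the shape of the problem — signature in, label out — is the same.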
The key moment in digital olfaction came in October 2004, when the Nobel Prize in Physiology or Medicine was awarded jointly to Richard Axel and Linda Buck for their discoveries of “odorant receptors and the organization of the olfactory system.”
Essentially, Axel and Buck discovered how the brain converts scent into a signal that can be recognized, remembered, and associated with emotions. Prior to that, the industry was in the dark, said Guilaumé.
“People thought smell was a mix of gases, and that with gas detectors, you could derive a sense of smell. But your nose can’t smell gases such as CO2 (carbon dioxide), carbon monoxide, or NOx (nitrogen oxides),” he explained. “Our digital nose uses similar receptors to those in your nose. Whatever your nose smells, our sensor can smell.”
Indeed, for man-made products, Guilaumé said Aryballe’s technology can outperform a human nose with cheese, wine, fragrances, and other things that are “a few centuries old.” “We can distinguish not just between vanilla and vanillin, but between vanillin from two different producers.” Some smells, however, remain challenging. “It will be difficult to compete with the reptilian reflexes that prevent us from eating something which is spoiled, or getting too close to something decomposing.”
What’s now? What’s next?
To classify and archive any object, whether an image, a sound, or a smell, it must first be labeled. Digital image labeling, or annotation, is crucial for all autonomous robotics work.
For example, for autonomous vehicles to identify a bicycle, thousands of images of different bicycles are labeled—usually by humans, but increasingly by AI—so the vehicle knows that when it detects a certain kind of two-wheeled device, it should treat it as a bicycle and not a pushcart.
It’s a similar process for smells, as Aryballe builds out a library of thousands of smells at varying levels of concentration, all of which could someday be used to make more combinations.
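One way to picture the library-building workflow described above is as a collection of annotated records, each pairing a sensor signature with an expert-assigned label, a source, and a concentration. Every field name and value below is hypothetical, sketched only to show how the same odor at different sources and concentrations becomes multiple library entries:

```python
from dataclasses import dataclass, field

@dataclass
class SmellRecord:
    """One annotated entry in a hypothetical digital smell library."""
    odor_name: str            # expert-assigned label, e.g. "vanillin"
    source: str               # where the physical sample came from
    concentration_ppm: float  # concentration at which it was captured
    signature: list           # raw sensor-response vector
    annotators: list = field(default_factory=list)  # experts who labeled it

library = [
    SmellRecord("vanillin", "producer A", 5.0, [0.80, 0.20, 0.50, 0.10], ["panelist-1"]),
    SmellRecord("vanillin", "producer B", 5.0, [0.78, 0.25, 0.48, 0.12], ["panelist-2"]),
    SmellRecord("vanillin", "producer A", 1.0, [0.31, 0.08, 0.19, 0.04], ["panelist-1"]),
]

# Group records by odor: one label, many signatures at varying sources
# and concentrations, which is what later lets a matcher cope with
# intensity differences.
by_odor = {}
for rec in library:
    by_odor.setdefault(rec.odor_name, []).append(rec)

print(len(by_odor["vanillin"]))  # prints 3
```

The grouping step reflects the article’s point: it is not one reference sample per smell, but many labeled captures of the same odor that make the library usable.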
Challenges abound, from sourcing the smells and recruiting experts to annotate odors, to identifying the primary components of a given scent, which is still not possible.
Guilaumé draws an analogy with the three primary colors, red, yellow, and blue. “We don’t know the primary odors, or if they even exist,” he says. “And if they do exist, we don’t know how many of them there are. But with the devices we are developing, I believe we will eventually identify them.”
In 1953, James Watson and Francis Crick discovered DNA’s twisted-ladder double helix, and only 50 years later, scientists sequenced the human genome.
So it’s not such a leap of logic that smell could be digitally codified, to be recreated, or eliminated, in the way that some headphones cancel noise. “We will be able to do this,” said Guilaumé. “The limitation for now is that this only works if you have the counter-odor ready.” Something that might come in handy for anyone riding in an autonomous vehicle after a previous rider enjoyed a particularly odiferous lunch.