The feel of a cat’s fur may reveal some information, but seeing the cat provides important details: is it a house cat or a lion? The sound of a crackling fire can be ambiguous, but its smell confirms that wood is burning. Our senses combine to give us a comprehensive understanding of the world, especially when individual cues are subtle, and the collective sum of these biological inputs can be greater than their individual contributions. Robots tend to follow more straightforward additive routes, but researchers at Penn State have now drawn on this biological concept to develop the first fully integrated, multisensory artificial neuron for artificial intelligence (AI).
Led by Saptarshi Das, associate professor of engineering science and mechanics at Penn State, the team published their work on September 15 in the journal Nature Communications.
“Robots make decisions based on the environment they are in, but their sensors generally do not communicate with each other,” said Das, who also has joint appointments in electrical engineering and materials science and engineering. “A collective decision can be made through the sensor processing unit, but is this the most efficient or effective way? In the human brain, one sense can influence another and allow a person to judge a situation better.”
For example, a car might have one sensor that scans for obstacles, while another senses darkness to modulate the intensity of its headlights. Individually, each sensor relays its information to a central unit, which then instructs the car to brake or adjust the headlights. According to Das, this process consumes more energy. Allowing sensors to communicate directly with each other can be more efficient in terms of both power and speed, particularly when the inputs to each are subtle.
“Biology enables small organisms to thrive in environments with limited resources, minimizing energy consumption in the process,” said Das, who is also affiliated with the Materials Research Institute. “The requirements for different sensors depend on context: in a dark forest, we rely more on hearing than on sight, but we don’t make decisions based on just one sense. We have a complete sense of our surroundings, and our decision-making relies on integrating what we see, hear, touch, smell, and so on. The senses evolved together in biology, but separately in AI. In this work, we are looking to combine sensors and mimic how our brains actually work.”
The team focused on integrating a tactile sensor and a visual sensor so that the output of one modulates the other, with the help of visual memory. According to Muhtasim Ul Karim Sadaf, a third-year doctoral student in engineering science and mechanics, even a short-lived flash of light can significantly increase the chance of moving successfully through a dark room.
“This is because visual memory can subsequently influence and aid the tactile responses for navigation,” Sadaf said. “This would not be possible if our visual and tactile cortices responded only to their respective unimodal cues. We have a photo memory effect: a light flashes and we can remember it. We incorporated that ability into a device through a transistor that provides the same response.”
The researchers fabricated the multisensory neuron by connecting a tactile sensor to a phototransistor based on a monolayer of molybdenum disulfide, a compound that exhibits unique electrical and optical characteristics useful for detecting light and supporting transistors. The sensor generates electrical spikes in a manner reminiscent of neurons processing information, allowing it to integrate both visual and tactile cues.
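The spiking integration can be illustrated with a toy model. The Python sketch below uses a simple leaky integrate-and-fire abstraction, with made-up threshold and leak values; it is not the molybdenum disulfide device physics from the paper, just a way to see how two individually subthreshold input streams can produce spikes once combined.

```python
# A minimal sketch of spiking integration: a leaky integrate-and-fire
# abstraction with invented constants, NOT the MoS2 device from the paper.

def spiking_response(tactile, visual, threshold=1.0, leak=0.5):
    """Integrate two input streams and emit a spike whenever the
    accumulated potential crosses the threshold."""
    potential, spikes = 0.0, []
    for t, v in zip(tactile, visual):
        potential = leak * potential + t + v  # combine both modalities
        if potential >= threshold:
            spikes.append(1)
            potential = 0.0                   # reset after firing
        else:
            spikes.append(0)
    return spikes

weak_touch = [0.3] * 10
weak_light = [0.4] * 10
print(spiking_response(weak_touch, [0.0] * 10))  # silent: touch alone stays subthreshold
print(spiking_response([0.0] * 10, weak_light))  # silent: light alone stays subthreshold
print(spiking_response(weak_touch, weak_light))  # spikes: together they cross the threshold
```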
It is the equivalent of seeing an “on” light on the stove and feeling heat coming off a burner: seeing the light does not necessarily mean the burner is still hot, but a hand only needs to feel a split second of heat before the body reacts and pulls the hand away from potential danger. The input of light and heat triggered signals that caused the hand’s response. In this case, the researchers measured the artificial neuron’s response by monitoring the signal outputs it produced in reaction to visual and tactile input cues.
To simulate touch input, the tactile sensor used the triboelectric effect, in which two layers slide against one another to produce electricity, meaning the touch stimuli were encoded into electrical impulses. To simulate visual input, the researchers shined a light into the monolayer molybdenum disulfide photo memtransistor, a transistor that can remember visual input, much as a person can retain the general layout of a room after a quick flash illuminates it.
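As a rough sketch of these two encodings, the snippet below treats touch as discrete pulses and models the visual input as an exponentially decaying trace left by a brief flash. The gain, decay constant, and time scale are all invented for the illustration; the actual retention behavior of the photo memtransistor is what the paper characterizes.

```python
import math

# Hypothetical encoding sketch: touch arrives as discrete triboelectric-style
# pulses; a brief light flash leaves a slowly decaying trace, loosely mimicking
# the photo memtransistor's visual memory. All constants are invented.

def visual_trace(flash_time, t, gain=0.8, decay=5.0):
    """Decaying memory trace left by a light flash at flash_time."""
    if t < flash_time:
        return 0.0
    return gain * math.exp(-(t - flash_time) / decay)

touch_pulses = {3, 7}  # time steps at which the tactile sensor fires

for t in range(10):
    touch = 1.0 if t in touch_pulses else 0.0
    memory = visual_trace(flash_time=2, t=t)
    # The lingering visual trace still boosts tactile pulses well after the flash.
    print(f"t={t}: touch={touch:.1f}  visual trace={memory:.2f}  combined={touch + memory:.2f}")
```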
They found that the sensory response of the neuron, simulated in the form of electrical output, increased when both the visual and tactile signals were weak.
“Interestingly, this effect resonates remarkably well with its biological counterpart: visual memory naturally enhances sensitivity to tactile stimulus,” said co-author Najam U Sakib, a third-year doctoral student in engineering science and mechanics. “When cues are weak, you need to combine them to better understand the information, and that’s what we saw in the results.”
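A toy calculation makes this “weak plus weak beats the sum” behavior concrete. In the sketch below, a saturating sigmoid-like response curve with arbitrary, made-up parameters stands in for the neuron: two weak cues that each produce almost no output yield a combined response larger than the sum of the separate ones, while strong cues gain little from being combined.

```python
import math

# Illustrative numbers only: a saturating response curve reproduces the
# super-additive behavior qualitatively. Threshold and steepness are made up.

def response(stimulus, threshold=1.0, steepness=4.0):
    """Sigmoid response: near-silent below threshold, saturating above it."""
    return 1.0 / (1.0 + math.exp(-steepness * (stimulus - threshold)))

weak_touch, weak_light = 0.4, 0.5
print(f"weak cues, sum of separate responses: {response(weak_touch) + response(weak_light):.3f}")
print(f"weak cues, combined response:         {response(weak_touch + weak_light):.3f}")  # larger

strong = 2.0
print(f"strong cues, sum of separate:         {2 * response(strong):.3f}")
print(f"strong cues, combined:                {response(2 * strong):.3f}")  # no longer larger
```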
Das explained that the artificial multisensory neuron system could improve the efficiency of sensing technology, paving the way for greener uses of artificial intelligence. As a result, robots, drones, and autonomous vehicles could navigate their environments more effectively while using less energy.
“The superior combination of weak visual and tactile signals is the key achievement of our research,” said co-author Andrew Pannone, a fourth-year doctoral student in engineering science and mechanics. “For this work, we looked at only two senses. We are working to identify the proper scenario for incorporating more senses and seeing what benefits they may offer.”
Harikrishnan Ravichandran, a fourth-year doctoral student in engineering science and mechanics at Penn State, is also a co-author of the paper.
Translated by Matthews Lineker from TechXplore