Multi-photon bionic skin realizes high-precision haptic visualization for reconstructive perception

07/02/2025 Compuscript Ltd

A new publication from Opto-Electronic Advances; DOI 10.29026/oea.2025.240152, discusses how multi-photon bionic skin realizes high-precision haptic visualization for reconstructive perception.

Human palm skin contains more than 20,000 tactile receptors. These receptors differ in skin depth, activation threshold, and trigger mode, and their cross-synergistic interactions allow the skin to pick up different types of tactile signals. The brain's neural centers then process these signals, enabling comprehensive and specific perception of object features, and even tactile visual reconstruction, that is, visualizing an object's structure, shape, texture, and other characteristics from touch alone. Inspired by this human tactile perception mechanism, tactile sensors that mimic the function of human skin have attracted extensive attention. To date, electrical sensors based on resistive, piezoelectric, and triboelectric principles have been able to mimic the tactile nerve, collecting and processing physical information by monitoring changes in the sensor's electrical output during contact. However, electrical tactile sensors suffer from potential leakage, susceptibility to corrosion and electromagnetic interference, low sensitivity, and slow response. In contrast, using light as the information carrier offers an attractive alternative path to tactile sensing, as has already been demonstrated in multi-parameter fiber-optic sensors.

To address these problems, the team of Associate Researcher Yu Yang from the Micro-Nano Optoelectronics and Intelligent Perception Group of the National University of Defense Technology proposed an optical microfiber array skin (OMAS) for object shape recognition in human-computer interaction. OMAS uses a four-channel longitudinal-and-transverse micro-nano structure to mimic the multifunctional tactile receptors of the human fingertip and subcutaneous tissue, realizing the synergy of multiple tactile receptors across multiple tactile modalities (shown in Fig. 1(a)). To further achieve human-like multimodal tactile visual reconstruction, the team integrated OMAS with a self-developed intelligent signal-processing module and used machine learning algorithms such as a fully connected neural network (FCNN) to emulate the brain's processing of bioelectrical signals, achieving multifunctional perception and spatial reconstruction of object features such as shape, hardness, and surface texture (as shown in Fig. 1(b) and (c)).
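The release does not specify the FCNN's architecture. As an illustration only, a minimal fully connected classifier mapping a hypothetical four-channel optical-power reading (one value per microfiber) to an object class could be sketched as follows; the layer sizes and channel count are assumptions, and the random weights stand in for trained parameters:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: 4 microfiber channels in, 6 object classes out.
n_in, n_hidden, n_out = 4, 16, 6

# Randomly initialised weights stand in for trained parameters.
W1 = rng.normal(scale=0.5, size=(n_in, n_hidden))
b1 = np.zeros(n_hidden)
W2 = rng.normal(scale=0.5, size=(n_hidden, n_out))
b2 = np.zeros(n_out)

def fcnn_classify(optical_power: np.ndarray) -> int:
    """Forward pass of a small fully connected network (ReLU + softmax)."""
    h = np.maximum(0.0, optical_power @ W1 + b1)   # hidden layer
    logits = h @ W2 + b2
    probs = np.exp(logits - logits.max())          # numerically stable softmax
    probs /= probs.sum()
    return int(np.argmax(probs))                   # index of predicted class

# One simulated reading from the four sensing channels.
reading = rng.normal(size=n_in)
predicted_class = fcnn_classify(reading)
```

In practice such a network would be trained on labeled pressure-to-optical-power recordings before its predictions are meaningful.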

Through experiments, the team demonstrated that OMAS can serve as a bionic flexible tactile skin for robots, acting as a multifunctional tactile receptor. As shown in Fig. 2, by analyzing static pressure data, OMAS can discern the softness, hardness, and shape of contacted objects (press recognition of six common objects with 100% accuracy). As shown in Fig. 3, by analyzing the characteristics of dynamic pressure signals, OMAS can accurately identify the material and surface texture of contacted objects: recognition accuracy reaches 98.5% for ten fabrics and 99% for the ten Braille digits 0-9. As a proof of concept, the team integrated OMAS into a robotic hand, which successfully identified a mahjong tile among several different objects and recognized and reconstructed the mahjong suits. These results verify the vectorial tactile perception advantage of this multi-photon bionic skin, which supports the detection of complex 3D surface textures and even tactile-based visualization and reconstruction perception.
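The six-object press-recognition result is attributed to a support vector machine. A minimal sketch of that kind of classification, using scikit-learn on synthetic stand-in data (the six-class setup matches the release, but the feature dimensionality, cluster structure, and all numbers here are assumptions), could look like:

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(1)

# Synthetic stand-in data: 6 object classes, each a cluster of
# 4-dimensional static-pressure feature vectors (dimensions assumed).
n_classes, n_per_class, n_features = 6, 30, 4
centers = rng.normal(scale=3.0, size=(n_classes, n_features))
X = np.vstack([c + rng.normal(scale=0.3, size=(n_per_class, n_features))
               for c in centers])
y = np.repeat(np.arange(n_classes), n_per_class)

# An RBF-kernel SVM separates the well-clustered classes easily.
clf = SVC(kernel="rbf").fit(X, y)
train_acc = clf.score(X, y)
```

On real sensor data one would of course hold out a test set and tune the kernel hyperparameters rather than score on the training set.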

The multi-photon bionic skin tactile receptors developed by the authors mimic human skin's ability to sense both static and dynamic pressure and can accurately characterize the shape, hardness, and complex texture of objects, making them applicable to smart wearable devices, robotic tactile sensing, virtual reality, and other fields. This technology provides a new solution for intelligent human-computer interaction and promotes the application of tactile sensors in wearable devices and robotic perception. Through further development of the multi-photon tactile skin sensing technology, the team will focus on enhancing its capability in complex environments such as underwater and space, with a view to wide application in mixed-reality interaction control, marine scientific research, and deep-space exploration.

Keywords: multiphoton neurons / human-computer interaction / tactile sensing / tactile imaging / tactile spatial reconstruction
# # # # # #

Based at the College of Science of the National University of Defense Technology (NUDT), the Micro-Nano Optoelectronics and Intelligent Sensing Group focuses on micro-nano optoelectronic devices, intelligent algorithms and optical computing, biochemical sensing, human-computer interaction, extreme-environment sensing, and other key technologies, driven by application needs such as target identification in complex environments, new guidance mechanisms, interaction control of underwater unmanned platforms, and intelligent monitoring of high-end power machinery. Funded by the National Key Research and Development Program, the National Natural Science Foundation of China, the 863 Program, and military basic-enhancement and pre-research priority programs, the group has won numerous national and military science and technology awards and has been granted more than 50 national invention patents. It has published more than 300 papers in Opto-Electronic Science, Opto-Electronic Advances, Laser & Photonics Reviews, Advanced Optical Materials, Nanophotonics, Photonics Research, Carbon, and other high-level journals, and a number of its research results rank in the top 1% and 1‰ of ESI.
# # # # # #
Opto-Electronic Advances (OEA) is a rapidly growing, high-impact, open-access, peer-reviewed monthly SCI journal with an impact factor of 15.3 (Journal Citation Reports, IF2023). Since its launch in March 2018, OEA has been indexed in the SCI, EI, DOAJ, Scopus, CA, and ICI databases, and has expanded its Editorial Board to 34 members from 17 countries.
# # # # # #


More information: http://www.oejournal.org/oea
Editorial Board: http://www.oejournal.org/oea/editorialboard/list
All issues available in the online archive (http://www.oejournal.org/oea/archive).
Submissions to OEA may be made using ScholarOne (https://mc03.manuscriptcentral.com/oea).
ISSN: 2096-4579
CN: 51-1781/TN
Contact Us: oea@ioe.ac.cn
Twitter: @OptoElectronAdv (https://twitter.com/OptoElectronAdv?lang=en)
WeChat: OE_Journal
# # # # # #

Zhou HY, Zhang C, Nong HC et al. Multi-photon neuron embedded bionic skin for high-precision complex texture and object reconstruction perception research. Opto-Electron Adv 8, 240152 (2025). doi: 10.29026/oea.2025.240152
Attachments
  • Fig. 1. Design and fabrication of the multiphoton neuron tactile skin. (a) Design concept and spatial reconstruction workflow of the multiphoton neuron tactile skin, which simulates the tactile perception and spatial reconstruction process of fused subcutaneous multi-receptor sensing in humans. (b) Flowchart of the interaction of each module during spatial reconstruction of a mahjong tile by the multiphoton neuron tactile skin. (c) Structure of the multiphoton neuron tactile skin, consisting of three layers: a silicone contact layer, a PDMS sensing layer with an embedded OM array, and a glass substrate layer.
  • Fig. 2. Validation of the shape and hardness recognition ability of the multiphoton tactile neurons. (a) Schematic of the force applied to a single sensing unit. (b) Changes in the light path through the waist region under three forces. (c) Stress and deformation diagrams (indicated by color bars) of the waist region of a single sensing unit under three forces. (d) Optical power response of a single sensing unit under 0 to 3 N normal contact force in 0.2 N steps, with each force held for 5 s. (e) Normal force sensitivity from 0 to 3 N, with error bars indicating slight variations in optical power over the response time. (f) Optical power response of a single sensing unit under more than 5000 repeated applications of 1 N pressure; the inset shows zoomed-in responses in two time domains. (g) Press recognition of six objects by hardness and shape using a support vector machine (SVM) machine learning algorithm. (h) Cluster data visualization. (i) Confusion matrix of the six measured objects, with 100% recognition accuracy.
  • Fig. 3. Validation of the material and surface texture recognition ability of the multiphoton neurons. (a) Neural network structure used in the fabric recognition experiments, with representative acquired signals. (b) Accuracy of the training and test sets during neural network training, stabilizing at 99% and 98.5%, respectively. (c) Confusion matrix of the ten-fabric classification on the validation set, with 98.5% accuracy. (d) OMAS achieves model training and online fabric recognition with the aid of artificial intelligence algorithms. (e) Schematic of the Braille digits 0-9. (f) Experimental process of real-time signal feedback and recognition of Braille phone numbers. (g) Real-time signal acquisition and recognition of Braille phone numbers displayed on the graphical user interface.
Regions: Europe, Ireland, Asia, China
Keywords: Applied science, Technology
