Funding: supported by the National Natural Science Foundation of China (52202117, 52232006, 52072029, and 12102256); the Collaborative Innovation Platform Project of the Fu-Xia-Quan National Independent Innovation Demonstration Zone (3502ZCQXT2022005); the Natural Science Foundation of Fujian Province of China (2022J01065); the State Key Lab of Advanced Metals and Materials (2022-Z09); the Fundamental Research Funds for the Central Universities (20720220075); and the Ministry of Education, Singapore, under its MOE ARF Tier 2 (MOE2019-T2-2-179).
Abstract: Efficient and flexible interactions require precisely converting human intentions into computer-recognizable signals, which is critical to the breakthrough development of the metaverse. Interactive electronics face a common dilemma: they either realize high-precision, stable touch detection but are rigid, bulky, and thick, or achieve high wearable flexibility but lose precision. Here, we construct highly bending-insensitive, unpixelated, and waterproof epidermal interfaces (BUW epidermal interfaces) and demonstrate their interactive applications in conformal human–machine integration. The BUW epidermal interface, based on an addressable electrical contact structure, exhibits high-precision and stable touch detection, high flexibility, rapid response time, excellent stability, and a versatile "cut-and-paste" character. Whether flat or bent, the BUW epidermal interface can be conformally attached to human skin for real-time, comfortable, and unrestrained interactions. This research provides promising insight into functional-composite and structural design strategies for developing epidermal electronics, offers a new technology route, and may further broaden human–machine interactions toward the metaverse.
Funding: supported by the National Natural Science Foundation of China under Grants U1805261 and 22161142024, and by the A*STAR SERC AME Programmatic Fund (A18A7b0058).
Abstract: Human–machine interactions using deep-learning methods are important in research on virtual reality, augmented reality, and the metaverse. Such research remains challenging because current interactive sensing interfaces for single-point or multipoint touch input are hampered by massive crossover electrodes, signal crosstalk, propagation delay, and demanding configuration requirements. Here, an all-in-one multipoint touch sensor (AIOM touch sensor) with only two electrodes is reported. The AIOM touch sensor is efficiently constructed from gradient resistance elements, which can adapt to diverse application-dependent configurations. Combined with deep-learning methods, the AIOM touch sensor can be used to recognize, learn, and memorize human–machine interactions. A biometric verification system built on the AIOM touch sensor achieves an identification accuracy of over 98% and offers promising hybrid cybersecurity against password leaking. Diverse human–machine interactions, including freely playing piano music and programmatically controlling a drone, demonstrate the high stability, rapid response time, and excellent spatiotemporally dynamic resolution of the AIOM touch sensor, which will promote significant development of interactive sensing interfaces between fingertips and virtual objects.
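The biometric verification idea rests on learning to classify a user from the waveform their touch produces. The abstract does not specify the network, so the sketch below uses a deliberately simple stand-in: logistic regression trained by gradient descent on synthetic touch profiles for two hypothetical users. The waveform shapes, noise level, and training settings are all assumptions chosen only to make the pipeline concrete and runnable; the paper's actual system uses a deep network on real sensor traces.

```python
import numpy as np

# Illustrative sketch (not the paper's model): verifying which of two
# synthetic "users" produced a touch waveform, via logistic regression.

rng = np.random.default_rng(0)

def make_waveform(user: int, n: int = 32) -> np.ndarray:
    """Each hypothetical user has a characteristic press profile plus noise."""
    t = np.linspace(0, 1, n)
    shape = np.sin(np.pi * t) if user == 0 else np.sin(np.pi * t) ** 2
    return shape + 0.05 * rng.standard_normal(n)

# 100 labeled waveforms per user.
X = np.stack([make_waveform(u) for u in (0, 1) for _ in range(100)])
y = np.array([0] * 100 + [1] * 100)

# Train logistic regression by full-batch gradient descent.
w, b = np.zeros(X.shape[1]), 0.0
for _ in range(500):
    p = 1 / (1 + np.exp(-(X @ w + b)))   # predicted probability of user 1
    grad = p - y                          # gradient of the log loss
    w -= 0.1 * X.T @ grad / len(y)
    b -= 0.1 * grad.mean()

pred = (1 / (1 + np.exp(-(X @ w + b)))) > 0.5
acc = (pred == y).mean()
print(f"training accuracy: {acc:.2f}")
```

Even this linear stand-in separates the two synthetic profiles reliably; the point is the pipeline shape (waveform in, identity out), onto which a deep network and real AIOM sensor data would be substituted.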