Contactless Sensing with Acoustic Signals
Dr. Yuanqing Zheng is an Associate Professor in the Department of Computing at The Hong Kong Polytechnic University. He received the Ph.D. degree in CS from Nanyang Technological University in 2014, and the B.S. degree in EE and the M.E. degree in Communication and Information System from Beijing Normal University in 2007 and 2010, respectively. His research interests include Wireless Networking and Mobile Computing, Acoustic and RF Sensing, and the Internet of Things (IoT). He has published research papers in journals including IEEE/ACM ToN, TMC, and ToSN, and in conferences including IEEE INFOCOM, ICNP, ICDCS, and ACM MobiCom, MobiSys, SenSys, and UbiComp. He received the Best Paper Award at INFOCOM 2020 and was a Best Paper Candidate at INFOCOM 2021. He serves on the editorial board of IEEE Transactions on Wireless Communications (2021-), and as a TPC member of conferences such as IEEE INFOCOM (2016-2022), ACM SenSys 2021, and ACM/IEEE IoTDI (2021-2022).
With the proliferation of smart devices equipped with microphones and speakers, recent works use acoustic signals to track hand movement, recognize gestures, and locate users. However, these systems suffer from low robustness in practice due to interference and noise. Moreover, the performance of neural network models degrades when there is not a sufficient amount of training data. In this talk, we will introduce our latest work on acoustic gesture recognition and head orientation estimation for smart devices. In particular, we will introduce RobuCIR, which recognizes hand gestures with acoustic signals, and HOE, which estimates a user's facing orientation with microphone arrays.
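Estimating orientation or location with a microphone array typically builds on the time difference of arrival (TDOA) of a signal at different microphones. The sketch below illustrates one standard TDOA estimator, GCC-PHAT (generalized cross-correlation with phase transform); it is a generic building block, not the specific method used in RobuCIR or HOE, and all function and variable names are illustrative.

```python
import numpy as np

def gcc_phat(sig, ref, fs, max_tau=None):
    """Estimate the delay (in seconds) of `sig` relative to `ref`
    using GCC-PHAT. Illustrative sketch, not the authors' method."""
    n = len(sig) + len(ref)
    # Cross-power spectrum with PHAT weighting (whitened magnitude)
    SIG = np.fft.rfft(sig, n=n)
    REF = np.fft.rfft(ref, n=n)
    R = SIG * np.conj(REF)
    R /= np.abs(R) + 1e-12
    cc = np.fft.irfft(R, n=n)
    max_shift = n // 2
    if max_tau is not None:
        max_shift = min(int(fs * max_tau), max_shift)
    # Rearrange so zero lag sits in the middle of the window
    cc = np.concatenate((cc[-max_shift:], cc[:max_shift + 1]))
    shift = np.argmax(np.abs(cc)) - max_shift
    return shift / fs

# Synthetic example: two microphones hear the same chirp,
# with mic2 delayed by a known number of samples.
fs = 16000
t = np.arange(0, 0.05, 1.0 / fs)
chirp = np.sin(2 * np.pi * np.linspace(1000, 4000, t.size) * t)
delay_samples = 8
mic1 = np.concatenate((chirp, np.zeros(delay_samples)))
mic2 = np.concatenate((np.zeros(delay_samples), chirp))
tau = gcc_phat(mic2, mic1, fs)  # recovers roughly delay_samples / fs
```

Given delays between microphone pairs and the known array geometry, the direction of the sound source (and hence a cue to the user's facing orientation) can be triangulated.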