Workshop program

Time | Description | Presenter | Title
11:30–12:00 | Welcome and workshop plan | Arthur Sluÿters and Klen Čopič Pucihar | Welcome and workshop plan
12:00–12:30 | Session 1: Radar-based interaction | Aaron Quigley (remote) | Radar and Camera Based Fine-grained Surface Context Awareness
12:30–13:00 | | Ryo Hajika, Tamil Selvan, Yun Suen Pai (remote) | RadarHand: A Wrist-Worn Radar for On-Skin Touch-Based Proprioceptive Gestures
13:00–14:00 | Lunch break | |
14:00–14:15 | Session 2: Datasets and gesture interaction | Arthur Sluÿters (on-site) | Mogi: Multi-Orientation Gesture Interaction with a 140GHz High-Resolution On-Chip Radar
14:15–14:30 | | Nuwan T. Attygalle (on-site) | Comparative Testing of Radar Signal Representations when Sensing Through Materials
14:30–15:00 | Discussion | |
15:00–15:30 | Session 3: Engineering systems | Moeness Amin (remote) | A Hand Air-Writing System using MIMO Radar and Deep Learning
15:30–15:45 | | Matjaž Kljun (on-site) | DiSiPro: Digital Signal Processing Tool for Radar-Based Human-Computer Interaction
15:45–16:00 | | Arthur Sluÿters (on-site) | zeroG: Towards an Integrated Development Environment for Deploying Radar-based Gesture User Interface
16:00–16:30 | Discussion | |
16:30–17:00 | Evening coffee break | |
17:00–17:15 | Datasets and prototypes | Klen Čopič Pucihar | Publicly available datasets in Radar-Based HCI
17:15–17:45 | Open session | | Prototypes & data collection
17:45–18:45 | Dissemination | | Open session: general challenges, systematic literature review, book & joint publication
18:45 | Wrap-up | |

Radar and Camera Based Fine-grained Surface Context Awareness

Invited talk: Aaron Quigley

The exploration of novel sensing to facilitate new interaction modalities is an active research topic in Human-Computer Interaction. Across the breadth of HCI, we can see the development of new forms of interaction underpinned by the appropriation or adaptation of sensing techniques based on the measurement of sound, light, electric fields, radio waves, biosignals, etc. In this talk, Professor Quigley will delve into a range of novel radar-based interactions in which people use footwear for walking, running, or exercise.

A list of the papers presented in this invited talk is available here.

RadarHand: A Wrist-Worn Radar for On-Skin Touch-Based Proprioceptive Gestures

Invited talk: Ryo Hajika, Tamil Selvan, Yun Suen Pai

RadarHand is a wrist-worn wearable with millimeter-wave radar that detects on-skin touch-based proprioceptive hand gestures. Across a series of studies, the researchers 1) evaluated the proprioceptive and tactile perception characteristics of the back of the hand, 2) trained deep-learning models for gesture classification, and 3) evaluated RadarHand's performance in real time under two interaction models: active interaction and reactive interaction. In this presentation, the researchers introduce the three studies and discuss the implications of RadarHand for gesture recognition and directions for future work.

Mogi: Multi-Orientation Gesture Interaction with a 140GHz High-Resolution On-Chip Radar

Workshop paper: Maxim Rykunov, André Bourdoux, Hichem Sahli, Klaas Bombeke, Arthur Sluÿters and Sébastien Lambot

End-users today are accustomed to interacting with surfaces of various orientations, either graphically with a mouse or pointer, or tactilely with a finger. They are also increasingly used to interacting gesturally with surfaces in multiple orientations, from horizontal and oblique to vertical, most often using mid-air gestures. They now want to interact in the same way with any interactive surface that involves a radar, which has the advantage of being insensitive to lighting and visibility conditions while preserving privacy. This paper motivates radar gesture interaction in front of interactive surfaces with multiple orientations and aims to compare the accuracy of gesture recognition in different orientations based on three categories of gesture candidates: lateral, horizontal, and vertical.

Comparative Testing of Radar Signal Representations when Sensing Through Materials

Workshop paper: Nuwan T. Attygalle, Matjaž Kljun, Una Vuletić and Klen Čopič Pucihar

In recent years, gesture recognition using miniaturised radar sensing has attracted attention in both academia and industry. The ability to sense mid-air gestures with miniaturised radars through non-conductive materials, with high spatial resolution, opens up new opportunities for interacting with systems that embed such radar technology. Examples include radars embedded in wearables, car dashboards, smart furniture, and other objects within smart environments. However, a deeper understanding of how different occluding materials affect gesture recognition performance is needed. Previous studies have primarily evaluated only one type of radar signal representation, even though several other representations exist and have proved effective. One way to better understand the impact of occluding materials on the radar signal is thus to compare different radar signal representations. To this end, this paper conducts a comparative test of four signal representations (in-phase and quadrature (IQ) representations in the time and frequency domains, range-angle, and range-Doppler) in order to evaluate their robustness against signal distortions caused by occluding materials. The preliminary results show that performance increases with a higher transmission coefficient and that, compared to IQ, the range-Doppler and range-angle signal representations are significantly more robust against distortions caused by occluding materials.
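To make the compared representations concrete, the sketch below shows how a range-Doppler map can be derived from one frame of raw IQ data using standard FFT processing. It is a minimal NumPy illustration under assumed frame dimensions and Hann windowing, not the processing pipeline used in the paper.

```python
import numpy as np

def range_doppler_map(iq_frame: np.ndarray) -> np.ndarray:
    """Compute a range-Doppler map from one frame of raw FMCW IQ data.

    iq_frame: complex array of shape (num_chirps, num_samples_per_chirp).
    The frame shape and the Hann windows are illustrative assumptions only.
    """
    num_chirps, num_samples = iq_frame.shape

    # Range FFT: fast-time samples within each chirp map to range bins.
    range_fft = np.fft.fft(iq_frame * np.hanning(num_samples), axis=1)

    # Doppler FFT: slow-time samples across chirps map to velocity bins.
    doppler_fft = np.fft.fft(range_fft * np.hanning(num_chirps)[:, None], axis=0)
    doppler_fft = np.fft.fftshift(doppler_fft, axes=0)  # centre zero velocity

    # Log-magnitude map (velocity bins x range bins), a common classifier input.
    return 20 * np.log10(np.abs(doppler_fft) + 1e-12)

# Toy usage with synthetic data: 64 chirps of 128 samples each.
frame = (np.random.randn(64, 128) + 1j * np.random.randn(64, 128)).astype(np.complex64)
print(range_doppler_map(frame).shape)  # (64, 128)
```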

A Hand Air-Writing System using MIMO Radar and Deep Learning

Invited talk: Moeness Amin

Recently, radar-based hand gesture recognition (HGR) has gained increased attention in several applications involving contactless human-machine interaction (HMI). Air-writing, as a specific form of HGR, requires real-time target positioning and trajectory tracking, followed by alphanumerical recognition via deep learning. Benefiting from the large virtual array provided by Multiple-Input Multiple-Output (MIMO) radar, this work first proposes an interferometry-based processing method to acquire the subtle range and azimuth displacements of finger motions, thus enabling tracking of the alphanumerical trajectory. A ResNet50 convolutional neural network (CNN) trained on the trajectories is used to recognize the writing. Additionally, spatial interferometry is exploited to identify multiple strokes when writing complicated characters, punctuation marks, or words. This is achieved by utilizing the subtle elevation change induced by hand lifts between strokes. Experimental results show that the proposed air-writing system performs well in sensing and tracking hand movements, achieving an average recognition accuracy of over 95% across different kinds of air-writing.
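The interferometric tracking idea rests on two textbook relations: a two-way phase change Δφ between successive chirps corresponds to a radial displacement Δr = λΔφ/4π, and the phase difference between two receive antennas spaced d apart gives the azimuth via sin θ = λΔφ/(2πd). The following sketch applies these relations to single complex echo samples; the wavelength, antenna spacing, and sign conventions are placeholder assumptions, and this is not the paper's actual MIMO processing chain.

```python
import numpy as np

WAVELENGTH = 0.004                 # ~4 mm for a 77 GHz radar (placeholder value)
ANTENNA_SPACING = WAVELENGTH / 2   # typical half-wavelength RX spacing (assumption)

def range_displacement(echo_prev: complex, echo_curr: complex) -> float:
    """Sub-wavelength radial displacement between two chirps.

    The two-way phase change dphi relates to displacement dr by
    dr = wavelength * dphi / (4 * pi).
    """
    dphi = np.angle(echo_curr * np.conj(echo_prev))
    return WAVELENGTH * dphi / (4 * np.pi)

def azimuth_angle(echo_rx0: complex, echo_rx1: complex) -> float:
    """Azimuth (radians) from the phase difference between two RX antennas:
    sin(theta) = wavelength * dphi / (2 * pi * d)."""
    dphi = np.angle(echo_rx1 * np.conj(echo_rx0))
    return np.arcsin(WAVELENGTH * dphi / (2 * np.pi * ANTENNA_SPACING))

# Toy example: a target that moved 0.5 mm closer between two chirps.
true_dr = -0.0005
prev, curr = 1.0 + 0j, np.exp(1j * 4 * np.pi * true_dr / WAVELENGTH)
print(f"estimated displacement: {range_displacement(prev, curr) * 1e3:.2f} mm")
```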

DiSiPro: Digital Signal Processing Tool for Radar-Based Human-Computer Interaction

Workshop paper: Nuwan T. Attygalle, Matjaž Kljun and Klen Čopič Pucihar

The development of radar-based human-computer interaction systems using miniature radar-on-chip sensors has attracted significant interest. This interest is fuelled by the availability of affordable radar chips and by advancements in signal processing and machine learning that improve the accuracy of radar signal interpretation. However, several challenges remain, particularly when comparing different radar-based gesture interaction systems. One dimension along which to compare systems is the radar signal representation, since these representations can all be extracted from the raw voltage data; they include, among others, range-Doppler, range-angle, and point clouds. Existing research often limits comparative testing to a single radar signal representation, focusing mainly on gesture recognition algorithms or on minimal variations within digital signal processing pipelines. To fill this gap, we designed and created an open-source tool that enables fast and reliable dataset preparation for comparative testing of radar signal representations. The tool supports visualisation of different radar signal representations and includes a command-line interface for batch processing to streamline dataset preparation.
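As a rough illustration of the kind of batch dataset preparation described above, the sketch below converts a folder of raw recordings into several signal representations. The directory layout, the placeholder converters, and the function names are hypothetical and do not reflect DiSiPro's actual command-line interface or API.

```python
from pathlib import Path
import numpy as np

# Placeholder converters: in practice each would implement the corresponding
# signal-processing chain (see the range-Doppler sketch above).
REPRESENTATIONS = {
    "range_doppler": lambda frame: np.abs(np.fft.fft2(frame)),
    "range_profile": lambda frame: np.abs(np.fft.fft(frame, axis=1)),
}

def batch_convert(raw_dir: str, out_dir: str) -> None:
    """Convert every raw .npy recording into all configured representations,
    mirroring the batch-style dataset preparation described above."""
    for raw_path in sorted(Path(raw_dir).glob("*.npy")):
        frame = np.load(raw_path)  # assumed complex (chirps x samples) frame
        for name, convert in REPRESENTATIONS.items():
            target = Path(out_dir) / name / raw_path.name
            target.parent.mkdir(parents=True, exist_ok=True)
            np.save(target, convert(frame))

if __name__ == "__main__":
    batch_convert("recordings/raw", "recordings/converted")  # hypothetical paths
```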

zeroG: Towards an Integrated Development Environment for Deploying Radar-based Gesture User Interface

Workshop paper: Arthur Sluÿters and Mehdi Ousmer

Despite advancing at a tremendous pace in recent years, research on radar-based gesture interaction has produced few real-world applications. This can be explained by the complexity of integrating signal processing techniques and gesture recognition algorithms into user-friendly applications, especially for developers lacking expertise in radar-based gesture interaction. In response, this paper introduces zeroG, a software framework concept aimed at streamlining the development of radar-based gesture interfaces. Its graphical user interface will enable developers to assemble standardized modules into complex gesture recognition dataflows, facilitating both testing and application development. By introducing a clear separation of concerns between the application frontend and gesture recognition, zeroG will enable developers to effortlessly adapt existing dataflows to new applications or sensors, without requiring extensive experience in (radar-based) gesture recognition. This paper explores the current landscape of tools for creating radar-based gesture interfaces, introduces the core elements of the zeroG framework, and outlines potential challenges.
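To illustrate the dataflow idea, here is a minimal sketch of how standardized modules might be chained into a gesture-recognition pipeline kept separate from the application frontend. The module names and the pipeline API are invented for illustration; they do not describe zeroG's actual design.

```python
from typing import Callable, List
import numpy as np

# A dataflow stage is simply a function from one intermediate result to the next.
Stage = Callable[[np.ndarray], np.ndarray]

def make_pipeline(stages: List[Stage]) -> Stage:
    """Chain standardized processing modules into one gesture-recognition dataflow."""
    def run(frame: np.ndarray) -> np.ndarray:
        for stage in stages:
            frame = stage(frame)
        return frame
    return run

# Placeholder modules: each would wrap a real preprocessing or recognition step.
def clutter_removal(frame):
    return frame - frame.mean(axis=0, keepdims=True)  # remove static background

def range_doppler(frame):
    return np.abs(np.fft.fft2(frame))                 # simple 2D FFT representation

def classify(rd_map):
    return np.array([rd_map.mean() > 0.5])            # stand-in recognizer, not a real model

pipeline = make_pipeline([clutter_removal, range_doppler, classify])
print(pipeline(np.random.randn(64, 128)))
```

Swapping a sensor or a recognizer would then amount to replacing one stage in the list, which is the kind of reuse the separation of concerns described above is meant to enable.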

Publicly available datasets in Radar-Based HCI

Nuwan T. Attygalle, Arthur Sluÿters, Matjaž Kljun, Klen Čopič Pucihar