Our collaboration paper with Professor Gastner’s team (Yale-NUS) was accepted earlier this week and will appear in IEEE Transactions on Visualization and Computer Graphics (TVCG) very soon!
The work focuses on cartograms, i.e., maps on which statistical information is shown in diagrammatic form. We evaluated three mechanisms to help users process information from cartograms: linked brushing, tooltips, and animations. Overall, each mechanism is helpful for a different kind of task, so a general guideline is to offer all three in any interactive cartogram.
The project behind “It’s all in the timing” started back in 2017 as a student capstone (final-year) project in collaboration with Assistant Professor Christopher Asplund (Psychology). After a year of data gathering, we submitted the paper to CHI, where it received good feedback but was ultimately rejected for not clearly highlighting actionable results from the studies. We then resubmitted to ToCHI, where we received excellent comments and suggestions for improvement.
One interesting aspect of the project is its multidisciplinary nature, and how we (as HCI researchers) can work with and learn from psychology researchers.
If you have not done so yet, please have a look at the paper on either arXiv or the ACM DL.
In our paper “(M)ad to see me? Intelligent Advertisement Placement: Balancing User Annoyance and Advertising Effectiveness”, we propose Perceptive Ads, an adaptive ad placement strategy that determines the presentation, placement, and behaviour of ads based on the detected level of user interactivity with the app, as well as the level of disruption that would affect task quality. The idea is to place a minimal presentation of an ad (or its segments) in regions where it is salient but does not hamper user interaction with the app, and to design user-ad interactions with the user’s current task in mind.
A preprint of the paper is available here.
We are very proud to announce that our paper “Investigating Performance and Usage of Input Methods for Soft Keyboard Hotkeys” has been conditionally accepted to Mobile HCI 2020.
In this work, we formalized the concept of Soft Keyboard Hotkeys: shortcuts displayed on soft keyboards (on phones or tablets). In the paper, we focus on three input methods for activating them: Swipe, Once (Tap+Tap), and User Maintained (Hold).
In our first experiment, we compared the performance of these methods in one- and two-handed scenarios across devices (Phone, Tablet) and orientations (Landscape, Portrait), and found that Once (Tap+Tap) was usually the fastest input method. In a second experiment, users were tested on both Phone and Tablet in different mobility conditions. We allowed them to use any of the three methods at any time and observed that users tended to primarily use Once, relying secondarily on Swipe on the Phone and User Maintained on the Tablet.
A preprint version of the paper is available here.
The first one, “Pose Estimation for Facilitating Movement Learning from Online Videos”, presents a system that allows people to work out at home with video tutorials (e.g., YouTube videos). In this paper, we investigated which graphical elements of the interface may help users follow tutorials closely and correct pose errors.
The second paper, “Vibrotactile Feedback for Vertical 2D Space Exploration”, presents a wearable device that may help people locate objects in a 2D vertical plane in front of them. In this research, we encoded the X and Y coordinates using either continuous or discrete encoding, and found that the most efficient way to convey positional information is to use discrete encoding on the Y axis and continuous encoding on the X axis. These results build on our earlier work on digital map exploration and assistive technologies.
In “It’s all in the timing”, we investigated the effect of unexpected vibrotactile and audio events on vibrotactile perception, i.e., how an unexpected sound or vibration prior to a vibration pattern may affect our perception of that pattern.
In most cases, we found a negative effect of these events on perception. In some cases, however, depending on the precise timing of the event, we found a small improvement, as the event allowed users to focus more on the upcoming pattern.
This work was a collaboration with Christopher Asplund (Yale-NUS) and his team, and was originally Parag Bhatnagar’s final year project.
The paper will be published soon, but a preprint can be found on arXiv.
Our lab had two papers accepted at CHI 2020. The first paper is called “Nudge for Deliberativeness: How Interface Features Influence Online Discourses” and explores how simple interface changes may push people to create higher-quality responses when posting online.
For the second paper, “‘You Cannot Offer Such a Suggestion’: Designing for Family Caregiver Input in Home Care Systems”, our goal was to allow caregivers to provide input to occupational therapists in order to improve the living conditions of their loved ones and allow them to stay at home longer.
1. Sanju Menon, Weiyu Zhang, and Simon Perrault. 2020. Nudge for Deliberativeness: How Interface Features Influence Online Discourses. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI ’20). [Download paper]
2. Pin Sym Foong, Charis Anne Lim, Joshua Wong, Chang Siang Lim, Simon Tangi Perrault, and Gerald Huat Choon Koh. 2020. “You Cannot Offer Such a Suggestion”: Designing for Family Caregiver Input in Home Care Systems. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI ’20).