Title: Substitute Buttons: Exploring Tactile Perception of Physical Buttons for Use as Haptic Proxies
Participants: Bram VAN DEURZEN, Gustavo ROVELO RUIZ, Daniël BOT, Davy VANACKEN, Kris LUYTEN
Abstract: Buttons are everywhere and are one of the most common interaction elements in both physical and digital interfaces. While virtual buttons offer versatility, enhancing them with realistic haptic feedback is challenging. Achieving this requires a comprehensive understanding of the tactile perception of physical buttons and its transferability to virtual counterparts. This research investigates tactile perception of button attributes such as shape, size, and roundness, and its potential generalization across diverse button types. In our study, participants interacted with each of the 36 buttons in our search space and indicated which button they thought they were touching. The findings were used to establish six substitute buttons capable of effectively emulating tactile experiences across various buttons. In a second study, these substitute buttons were validated against virtual buttons in VR, highlighting their potential use as haptic proxies for applications such as encountered-type haptics.

Title: AntHand: Interaction Techniques for Precise Telerobotic Control Using Scaled Objects in Virtual Environments
Participants: Dries CARDINAELS, Bram VAN DEURZEN, Raf RAMAKERS, Kris LUYTEN
Abstract: This paper introduces AntHand, a set of interaction techniques for enhancing precision and adaptability in telerobotics through the use of scaled objects in virtual environments. AntHand operates in three phases: up-scaling interaction, for detailed control through a magnified virtual model; constraining interaction, which locks movement dimensions for accuracy; and post-editing, which allows optimizing the manipulation trace and reducing noise. The application of AntHand is showcased in a surgery-related use case demanding high accuracy and precise manipulation. AntHand demonstrates how collaboration between humans and robots can improve precise control of robot actions in telerobotic operations, while maintaining the familiar use of traditional tools rather than relying on specialized controllers.

Title: AR Guidance Design for Line Tracing Speed Control
Participants: Jeroen CEYSSENS, Bram VAN DEURZEN, Gustavo ROVELO RUIZ, Kris LUYTEN, Fabian DI FIORE
Abstract: In many jobs, workers execute precise line tracing tasks, such as welding, spray painting, or chiseling. Training and support for such tasks can be provided using VR and AR. However, to enable workers to achieve the required precision in movement and timing, the effect of visual guidance on continuous movement needs to be explored. In VR environments, we want to ensure people are trained in such a way that the obtained skill transfers to a real-world context, whereas in AR, we want to ensure an ongoing task can be completed successfully when visual guidance is added. To simulate these various contexts, we employ a VR environment to investigate the effectiveness of different visualizations for motion-based guidance in a line tracing task. We tested five visualizations: faster and slower arrows on the pen, the same arrows on the line, a dynamic graph on the pen or line, and a ghost object to follow. Each visualization was tested with the same set of five lines with different target speeds (2 cm/s to 10 cm/s in steps of 2 cm/s), with a training line at 5 cm/s. Our results show that the ghost object on the line is the most effective visualization for helping users achieve a specific speed. Users also perceived this visualization as the most engaging and easiest to use. These findings have significant implications for the development of AR-based guidance systems, specifically in the realm of speed control, across diverse domains such as industrial applications, training, and entertainment.
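As a side note on how such speed guidance can be driven: the ghost visualization boils down to advancing a target point along the line at the desired speed. The following is a minimal Python sketch of that idea under assumed names (ghost_position, a polyline of 2D points in cm); it is our own illustration, not code from the paper.

```python
import math

def ghost_position(polyline, speed_cm_s, t):
    """Point a constant-speed 'ghost' target has reached along a polyline.

    polyline: list of (x, y) points in cm, speed_cm_s: target tracing speed,
    t: elapsed seconds. The user's pen should coincide with this point when
    tracing at exactly the target speed.
    """
    distance = speed_cm_s * t  # arc length covered so far
    for (x0, y0), (x1, y1) in zip(polyline, polyline[1:]):
        seg = math.dist((x0, y0), (x1, y1))
        if distance <= seg:  # the ghost lies on this segment
            f = distance / seg
            return (x0 + f * (x1 - x0), y0 + f * (y1 - y0))
        distance -= seg
    return polyline[-1]  # past the end of the line: clamp to the last point

# A 20 cm straight line traced at 4 cm/s is halfway done after 2.5 s.
print(ghost_position([(0, 0), (20, 0)], 4, 2.5))  # -> (10.0, 0.0)
```

Rendering this point each frame gives the ghost object; the arrow and graph visualizations from the study would instead compare the user's measured speed against the same target speed.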
Title: Demonstrating History in Motion: Interactive 3D Animated Visualizations for Understanding and Exploring the Modeling History of 3D CAD Designs
Participants: Tom VEUSKENS, Raf RAMAKERS, Danny LEEN, Kris LUYTEN
Abstract: History in Motion (HiM) is an interactive visualization tool that enables CAD designers to interactively explore the design history of 3D CAD models. In contrast to manually exploring the modeling history of a CAD project, designers can select geometry elements in HiM to find the relevant modeling features. These modeling features are then explained to designers through a novel interactive 3D animation that visualizes how the features interact and are applied on top of the CAD model to realize the selected geometry. A control panel in HiM allows for a deeper exploration of the modeling features, with shortcuts for making modifications. During this demonstration, attendees can experiment with HiM on a variety of CAD designs and explore their design histories.

Title: Measurement Patterns: User-Oriented Strategies for Dealing with Measurements and Dimensions in Making Processes
Participants: Raf RAMAKERS, Danny LEEN, Jeeeun Kim, Kris LUYTEN, Steven Houben, Tom VEUSKENS
Abstract: The majority of errors in making processes can be traced back to errors in dimensional specifications. While technical aspects of measurement, such as precision and speed, have been extensively studied in metrology, the user aspects of measurement have received significantly less attention. Although little research specifically addresses the user aspects of handling dimensions, various systems have been built that embed new interactive modalities, processes, and techniques which significantly impact how users deal with dimensions or conduct measurements. However, these features are mostly hidden in larger system contributions. To uncover and articulate these techniques, we conducted a holistic literature survey on measurement practices in crafting techniques and systems for rapid prototyping. Based on this survey, we contribute 10 measurement patterns, which describe reusable elements and solutions for common difficulties when dealing with dimensions throughout workflows for making physical artifacts.

Title: History in Motion: Interactive 3D Animated Visualizations for Understanding and Exploring the Modeling History of 3D CAD Designs
Participants: Tom VEUSKENS, Raf RAMAKERS, Danny LEEN, Kris LUYTEN
Abstract: We present History in Motion (HiM), an interactive visualization tool that enables CAD designers to interactively explore the design history of 3D CAD models. In contrast to manually exploring the modeling history of a CAD project, HiM finds the relevant modeling features for geometry elements selected by the designer. We contribute a novel interactive 3D animation that visualizes how the modeling features interact and are applied on top of the CAD model to realize the geometry. A control panel allows for a deeper exploration of the modeling features, with shortcuts for making modifications.
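To make concrete what "finding relevant modeling features for selected geometry" can look like under the hood, here is a minimal Python sketch. The data model (features recording which geometry elements they create or modify, plus a reverse index) is our own simplification for illustration, not HiM's actual implementation.

```python
from collections import defaultdict

# Hypothetical, simplified modeling history: each feature lists the
# geometry elements it creates or modifies.
history = [
    {"feature": "Sketch1",  "geometry": {"edge_3", "edge_4"}},
    {"feature": "Extrude1", "geometry": {"face_7", "edge_3"}},
    {"feature": "Fillet1",  "geometry": {"edge_4"}},
]

# Reverse index: geometry element -> (history position, feature name).
index = defaultdict(list)
for step, entry in enumerate(history):
    for element in entry["geometry"]:
        index[element].append((step, entry["feature"]))

def relevant_features(selection):
    """Features touching any selected geometry element, in modeling order."""
    hits = {hit for element in selection for hit in index.get(element, [])}
    return [name for _, name in sorted(hits)]

# Selecting edge_4 surfaces the sketch that created it and the later fillet.
print(relevant_features({"edge_4"}))  # -> ['Sketch1', 'Fillet1']
```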
"FortClash: Predicting and Mediating Unintended Behavior in Home Automation" "Sven COPPERS, Davy VANACKEN, Kris LUYTEN" "Context-Aware Support of Dexterity Skills in Cross-Reality Environments" "Jeroen CEYSSENS, Fabian DI FIORE, Kris LUYTEN" "Figure 1: Overview of the developed XR prototypes including (1) an AR tool for operator assistance, (2) an AR tool providing surface coverage for cleaning cleanrooms, (3) a VR simulation of nuclear environments, and (4) a welding tool using AR passthrough to simulate welding light, seam, heat and guidance for direction. ABSTRACT Within our work, we apply context-awareness to determine how AR/VR technology should adapt instructions based on the context to suit user needs. We focus on situations where the user must carry out a complex manual activity that requires additional information to be present during the activity to achieve the desired result. To this end, the emphasis is on activities that require fine-motor skills and in-depth expertise and training, for which XR is a powerful tool to support and guide users performing these tasks. The contexts we detect include user intentions, environmental conditions, and activity progressions. Our work builds on these contexts with the main focus on determining how XR should adapt for the end-user from a usability perspective. The feedback we request from ISMAR consists of input in detection, usability, and simulation categories, together with how to balance these categories to create real-time and user-friendly systems. The next steps of our work will consider how to content should adjust based on the cognitive load, activity space, and environmental conditions. Index Terms: Human-centered computing-Interaction design-Interaction design process and methods-Activity centered design Human-centered computing-Human computer interaction (HCI)-Interaction paradigms-Mixed / augmented reality 1 RELATED WORKS ""Context awareness"" is defined by The Oxford Dictionary of Computing as ""The ability of a computer system to sense details of the external world and choose its course of action depending on its findings."" 1. Due to the nature of ""sensing the details of the external world"", context-awareness has been studied extensively in combination with Cross-Reality (XR) technology, like Augmented Reality (AR) and Virtual Reality (VR), which interact heavily with the external world. These topics expand from improving assistance instructions by adjusting them to the operation [5,9,20], by blending the content with the real environment by taking into account the 1 Oxford Dictionary of Computing definition of context awareness: https://www.oxfordreference.com/view/10.1093/oi/authority.20110803095634835 lighting and shadows [1, 8, 11], and by making the content adaptive to situations [10, 13]. Within these topics, Lindlbauer et al. (2019) used context-awareness to automatically change the amount of XR content shown based on the user's cognitive load and knowledge [10]. To further support the use of context-awareness in other XR applications, Chen et al. (2018, 2020) created frameworks for context-aware ubiquitous interaction [3] and for semantic-based material-aware interaction with the real environment [4]. Similarly, Gatullo et al. (2020) created a context-aware information manager for AR-provided technical documentation [6], and Wang et al. (2020) made a tool for authoring context-aware applications by utilizing programming-by-demonstration of daily activities [17]. 
Aside from object-based context-awareness, Orlosky et al. (2015) developed a management tool that automatically realigns AR content to avoid occlusion with real people in the environment [13]. All these systems allow content creators to quickly develop new context-aware XR content and interfaces that adjust correctly to different situations. However, to achieve proper immersion in simulations, XR content also needs to adapt its visualization to the changing environment. To achieve this effect, Barreira et al. (2018) studied how to adapt the shadows of virtual objects based on the lighting present in real-life outdoor environments [1]. Meanwhile, Mandl et al. (2017) created a deep learning model for changing the material reflections of digital content to match the lighting learned from the real environment [11]. Kan et al. (2019) expanded on this with a deep learning model for estimating light reflections in the real environment to simulate shadows and transparency of digital objects [8]. For our work, we will attempt to simulate the outcomes of operations requiring dexterity skills. These simulations must blend in with the real environment as they would during live procedures to acquire correct study results. Aside from simplifying XR content creation and blending real and digital objects, it is also necessary to study how context-aware visualizations should behave in different circumstances. Lampen et al. (2020) studied how context-aware assistance can be provided in the automotive industry by providing an augmented human for support and found initial benefits in user experience, motivation, and performance [9]. Doughty et al. (2021) built a deep learning model for detecting surgical operations and provided context-aware guidance on those operations to see how they can support such procedures.

Title: Choreobot: A Reference Framework and Online Visual Dashboard for Supporting the Design of Intelligible Robotic Systems
Participants: Bram VAN DEURZEN, Herman Bruyninckx, Kris LUYTEN
Abstract: As robots are equipped with software that makes them increasingly autonomous, it becomes harder for humans to understand and control them. Human users should be able to understand and, to a certain extent, predict what the robot will do. The software that drives a robotic system is often very complex and hard for human users to understand, and there is only limited support for ensuring that robotic systems are intelligible. Adding intelligibility to the behavior of a robotic system improves the predictability, trust, safety, usability, and acceptance of such autonomous robotic systems. Applying intelligibility to the interface design can be challenging for developers and designers of robotic systems, as they are experts in robot programming but not necessarily in interaction design. We propose Choreobot, an interactive, online, visual dashboard that is used with our reference framework to help identify where and when adding intelligibility to the interface design is required, desired, or optional. The reference framework and accompanying input cards allow developers and designers of robotic systems to specify a usage scenario as a set of actions and, for each action, capture the context data that is indispensable for revealing when feedforward is required. The Choreobot interactive dashboard generates a visualization that presents this data on a timeline for the sequence of actions that make up the usage scenario. A set of heuristics and rules highlights where and when feedforward is desired. Based on these insights, developers and designers can adjust the interactions to better support the human users working with the robotic system.
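As an illustration of the kind of rule such heuristics could encode, here is a minimal Python sketch. The Action fields and the needs_feedforward rule are our own hypothetical example, not Choreobot's actual framework, input cards, or rule set.

```python
from dataclasses import dataclass

@dataclass
class Action:
    name: str
    autonomous: bool    # the robot decides and moves on its own
    human_nearby: bool  # a person shares the workspace during this action
    reversible: bool    # the action can be safely undone

def needs_feedforward(action: Action) -> bool:
    """Illustrative heuristic: announce an autonomous action beforehand
    when a human is nearby or the action cannot be undone."""
    return action.autonomous and (action.human_nearby or not action.reversible)

scenario = [
    Action("move to bin", autonomous=True, human_nearby=False, reversible=True),
    Action("hand over part", autonomous=True, human_nearby=True, reversible=True),
    Action("weld seam", autonomous=True, human_nearby=False, reversible=False),
]

# Walk the scenario timeline and flag where feedforward is required.
for step, action in enumerate(scenario):
    if needs_feedforward(action):
        print(f"step {step}: show feedforward before '{action.name}'")
```

Running this flags the hand-over (a human is nearby) and the weld (irreversible), which is the sort of per-action annotation the dashboard lays out on its timeline.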
Title: Engineering Interactive Computing Systems 2022: Editorial Introduction
Participants: Kris LUYTEN, Philippe Palanque, Aaron John Quigley, Marco Winckler