Title: ERGO-Eyehand: Ergonomic Monitoring and Improvement
Affiliations: Applied Computer Science Lab
Abstract: The project aims to investigate solutions that reduce the ergonomic risk for operators. To this end we want to develop: 1) offline and online ergonomic risk assessment methods; 2) software to design and optimize the layout of a work cell in terms of ergonomics; 3) control logic for adaptive collaborative robots (cobots) to reduce the ergonomic workload for operators who work with heavy workpieces. This will lead to faster ergonomic analyses, live feedback for operators, the integration of support functions, and supporting technology for assembly activities (e.g., a cobot).

Title: Faster assembly and maintenance through augmented reality
Affiliations: Applied Computer Science Lab
Abstract: The overall goal of the project is to create an economically feasible, user-centered augmented reality application methodology for flexible assembly and inspection in a low-volume/high-mix manufacturing environment, which (i) requires limited reprogramming effort for new product variants, (ii) results in a flexible work instruction scheme giving maximum freedom to the operators, and (iii) is independent of markers for the online state estimation of the assembled product.

Title: Smart Handling of moderate loads
Affiliations: Applied Computer Science Lab
Abstract: The overall goal of the project is to develop novel smart handling system architectures for moderate payloads, resulting in the design of a smart handler with the following characteristics: (1) the ability to handle varying payloads with minimal physical stress on operators, and (2) the flexibility to handle different loads (in both weight and shape) at different reach distances. As a direct consequence, this new smart handler would drastically reduce the detrimental ergonomic issues and long-term health problems caused by operators' reluctance to use current cumbersome hoist systems. In several companies, absenteeism due to ergonomics-related issues exceeds 2%, which leads to direct costs for the company as well as societal costs.

Title: Next level mutation testing: fewer, smarter & faster (NEXT-O-TEST)
Promoter: Serge Demeyer
Affiliations: Co-Design of Cyber-Physical Systems (Cosys-Lab), Antwerp Systems and software Modelling (AnSyMo)
Abstract: Software updates are omnipresent in today's digital era, and the release cycles within ICT companies are getting faster and faster. Tesla, for example, loads new software into its cars once every month; Amazon goes even faster and pushes changes to its servers every 12 seconds! With such fast release cycles the need for effective quality assurance is rising: software teams must take all possible steps to prevent defects from slipping into production. Today, mutation testing is the state-of-the-art technique for fully automatically assessing the fault detection capacity of a software test suite. However, the approach is too slow for industrial adoption. The NEXT-O-TEST project will therefore investigate three different ways to improve upon the state of the art (fewer, smarter, and faster) to make mutation testing effective even in the presence of rapid release cycles. As such, NEXT-O-TEST will allow the NEXOR Consortium to strengthen its expertise on "quality control and test automation" and reinforce its position as a core lab within the Flanders Make research centre.
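To make the core idea concrete: mutation testing injects small syntactic faults ("mutants") into the code under test and checks whether the test suite fails on, i.e. "kills", each of them. The sketch below is a minimal, self-contained Python illustration; the price function and the operator-swapping mutants are invented for this example and are not part of the NEXT-O-TEST tooling.

    # Minimal illustration of mutation testing: a mutant replaces one
    # operator in the code under test; a good suite kills every mutant.
    import operator

    def price_with_discount(total, threshold, rate, gt=operator.gt):
        """Apply a discount when total exceeds threshold; the comparison
        operator is injected so this sketch can mutate it."""
        return total * (1 - rate) if gt(total, threshold) else total

    def suite_passes(gt):
        """True if every test passes for the given comparison operator."""
        return all([
            price_with_discount(200, 100, 0.5, gt) == 100,  # above threshold
            price_with_discount(50, 100, 0.5, gt) == 50,    # below threshold
            price_with_discount(100, 100, 0.5, gt) == 100,  # boundary value
        ])

    mutants = {">=": operator.ge, "<": operator.lt, "==": operator.eq}

    assert suite_passes(operator.gt)        # suite is green on the original
    killed = [name for name, op in mutants.items() if not suite_passes(op)]
    print(f"mutation score: {len(killed)}/{len(mutants)}, killed: {killed}")

Note that the boundary-value test is what kills the ">=" mutant; dropping it would leave that mutant alive, which is exactly the kind of weakness a mutation score exposes.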
"Erik Neyts" "Katholieke Universiteit Leuven, Pennsylvania State University, Izmir Institute of Technology, Federal University Ceara, Shahid Rajaee University, IMEC, Catholic University of Louvain, Université d'Orléans, Ghent University, University of Liège, University of Camerino, Plasma Lab for Applications in Sustainability and Medicine - Antwerp (PLASMANT)" "In this WOG, the overarching goal is to employ existing and develop new computational methodologies at the atomistic and molecular scale to model and simulate fundamental material properties to explore and understand novel material functionalities." "Computational modeling of materials: from atomistic properties to new functionalities." "Katholieke Universiteit Leuven, Pennsylvania State University, Izmir Institute of Technology, Federal University Ceara, Shahid Rajaee University, IMEC, Catholic University of Louvain, Université d'Orléans, Ghent University, University of Liège, University of Camerino, Electron microscopy for materials research (EMAT)" "Computational modeling is an essential factor in the study of the properties of materials. Nowadays, computational modeling is extensively used to predict and develop new materials. This requires a thorough knowledge of the local atomic (structural and electronic) structure and its influence on the macroscopic properties. Although, in principle, all materials can be described with the laws of quantum mechanics, it is impossible in practice to derive all material properties from these. Even with today's most powerful supercomputers, quantum mechanical electronic structure calculations are limited to a thousand atoms and to a maximum of 1 ns. To study length and time scales that go beyond these atomic scales, (semi-) empirical techniques are used and further developed through multiscale modeling. Transitions between models describing at different time and length scales are achieved by studying the relevant scale with the appropriate computational techniques. In order to have a thorough understanding of materials properties it is therefore important for collaborations between computational groups with expertise on different methods to flourish." "The information content of dynamic cues in human sound localization." "Herbert Peremans" "Condensed Matter Theory, Engineering Management" "Understanding the workings of human sound localization, and in particular which acoustic cues we use to perceive our acoustic environment in three dimensions (3D), is not only of fundamental interest, but has become increasingly relevant in the light of nowadays advance of 3D audio displays through headphones. In the past, most research has focused on the role of static cues , i.e. when the head and source are stationary, yet it is known that localization is greatly improved if listeners are allowed to move their head during stimulus presentation. In this project, we investigate the role of dynamic cues provided by small movements of the head or source, within an information- theoretic framework. We use a proven ideal-observer model for static human sound localization and extend it to account for the dynamic acoustic cues involved. First, we study what head movements carry the most information and how this depends on the location of the source. Next, we consider the mirror situation and investigate how much information can be conveyed through small movements of the source. 
Title: Simulation based testing of large scale internet of things applications
Promoter: Peter Hellinckx
Affiliations: Co-Design of Cyber-Physical Systems (Cosys-Lab), Antwerp Systems and software Modelling (AnSyMo), Internet Data Lab (IDLab)
Abstract: The goal of this project is to introduce a simulation-based methodology to cope with the scalability constraints of modern IoT software testing, and more specifically the testing of ultra-large-scale systems with emergent behavior. With IoT becoming more mainstream and the number of interconnected devices rising, the complexity and scale of the IoT landscape will increase considerably. Interoperability between IoT devices and actuators of all sorts will prove vital for future IoT applications. As a result of the increased scale and diversity, and because of modern decentralized IoT architectures such as edge computing, a whole new type of IoT application will gain importance: one in which local, decentralized interaction between devices and actors leads to global emergent behavior. The concept of emergence can be compared to a flock of birds, where local interactions between individual birds lead to globally optimized behavior. This idea is also very relevant in IoT; imagine, for example, a smart traffic light application where local interactions between traffic lights lead to globally optimized traffic flow. This type of IoT application will, however, create major difficulties for application validation, testing, and calibration, because realistic emergent behavior only arises when the IoT application is executed in a large-scale, diverse environment that resembles the eventual operational environment. Deploying such applications to a real-life isolated IoT testbed would be impractical, as setting up such an environment at a realistic scale is too costly and requires too much effort in the early stages of development. Instead of relying on expensive testbeds, we propose a large-scale simulation-based approach. Such a system needs to incorporate hundreds of thousands of virtual sensors interacting among each other and with the environment, and the behavior of these entities must be modeled carefully, which brings additional technical challenges. All virtual sensors should also be continuously active so they can interact in real time with other systems: because an important part of the behavior of both conventional IoT systems and emergent-behavior IoT (EBI) systems is controlled by IoT middleware, the simulated entities should be able to interact with that middleware as if they were real-life IoT entities. We refer to this as software-in-the-loop (SIL) simulation. Because of this real-time requirement, a large number of simulation entities must run in parallel, which greatly increases the computational complexity; solely relying on state-of-the-art large-scale simulation techniques is insufficient. The contribution of this project is the creation of a methodology for running real-time, large-scale simulations for testing and analyzing both conventional IoT systems and emergent-behavior-based IoT systems. We will focus on two major tracks: in the first, we reduce the computational complexity by dynamically increasing the abstraction level of simulation models; in the second, we aim to reduce the network communication overhead of distributed simulations by optimizing the partitioning of simulation entities over multiple simulation servers.
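The traffic-light example can be made concrete with a toy model. In the sketch below, each simulated light repeatedly relaxes its green-phase offset using only its two neighbors' offsets, and a coordinated "green wave" emerges along the road. The topology, travel time, and update rule are invented for illustration and say nothing about the project's actual simulator; they only show how purely local rules can yield a useful global pattern, which is what makes testing at realistic scale necessary.

    # Toy emergent behavior: traffic lights along one road adjust their
    # green-phase offsets using only neighboring values; a global
    # "green wave" emerges without any light knowing the full schedule.
    import numpy as np

    n = 20                         # number of intersections (arbitrary)
    travel = 12.0                  # assumed travel time between lights (s)
    offset = np.zeros(n)
    offset[-1] = (n - 1) * travel  # only the two end lights are anchored

    for _ in range(5000):          # purely local, repeated relaxation
        offset[1:-1] = 0.5 * (offset[:-2] + offset[2:])

    # Offsets converge to exactly one travel time apart: a green wave.
    print(np.allclose(np.diff(offset), travel, atol=1e-3))   # True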
Title: Sub-Nyquist signal processing in marine radar
Promoter: Annie Cuyt
Affiliations: Antwerp Maritime Academy, Computational Mathematics
Abstract: 'Radar' is an acronym derived from the words Radio Detection and Ranging. In maritime civil applications, radar is used as a navigation aid to avoid collisions. The technique uses short bursts of an emitted electromagnetic wave at a high frequency and in a precise direction. If an object is near the transmitting antenna, echoes are sent back to it. The time between the transmission and the reception of an echo indicates the distance between the antenna and the object. The received echoes (mainly shifted in frequency and amplified) are processed and then represented on a screen. Since cathode ray tube (CRT) screens have now largely given way to digital screens, digitization is often necessary, and hence the analogue signal must be sampled. To reconstruct the signal correctly, the receiving end has to sample it at a rate higher than the Nyquist rate. This limit is generally accepted as a constraint on the cost and performance of radar systems. Making use of recent results in exponential analysis developed at UAntwerpen, it is possible to break the Nyquist rate in signal processing. This project investigates the feasibility of these new techniques for analysing 3-dimensional echoes sampled at a sub-Nyquist rate. It is an interdisciplinary effort that joins HZS marine engineers, experienced in echo sounding, with researchers from UAntwerpen specialised in computational mathematics. The ambition is to achieve better performance in the use of electromagnetic pulses to detect objects at low cost by using the most current algorithms, bypassing the investment of switching to more expensive hardware.
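The engine behind such methods is Prony-type exponential analysis: a signal modeled as a sum of m complex exponentials is pinned down by as few as 2m uniform samples. The sketch below shows the classical matrix-pencil variant on noiseless synthetic data; the frequencies, amplitudes, and sampling step are invented, and the UAntwerpen sub-Nyquist techniques, which additionally resolve the aliasing that arises when sampling below the Nyquist rate, are not shown here.

    # Classical Prony-type exponential analysis: m frequencies recovered
    # from only 2m uniform samples (noiseless, parameters invented).
    import numpy as np

    freqs = np.array([12.0, 31.0, 47.0])     # Hz, treated as unknowns
    amps = np.array([1.0, 0.6, 0.3])
    m, delta = len(freqs), 1e-3              # model order, sampling step (s)

    k = np.arange(2 * m)                     # only 2m = 6 samples
    f = (amps * np.exp(2j * np.pi * freqs * delta * k[:, None])).sum(axis=1)

    # Two shifted Hankel matrices; the eigenvalues of H0^{-1} H1 are
    # exp(2*pi*i*freq*delta), one per exponential component.
    idx = np.arange(m)
    H0 = f[idx[:, None] + idx]
    H1 = f[idx[:, None] + idx + 1]
    lam = np.linalg.eigvals(np.linalg.solve(H0, H1))
    print(np.sort(np.angle(lam)) / (2 * np.pi * delta))  # ~ [12, 31, 47]

A Fourier transform of six samples could never separate these three components; the parametric exponential model is what buys the parsimony that the project pushes below the Nyquist rate.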
Title: Load balancing and scheduling in large-scale computer systems
Promoter: Benny Van Houdt
Affiliations: Internet Data Lab (IDLab)
Abstract: Since the introduction of the very first communication networks, queueing models have played a key role in improving network performance. This has resulted in a large body of queueing theory literature that has found widespread use in many other areas of science and technology. As the area of computer systems and networks is ever evolving, so is the need for new, tailored queueing models. Large-scale systems (e.g., grid computing or cloud computing) have become quite prevalent today and are often composed of many heterogeneous resources. The analysis of such large-scale heterogeneous systems using traditional queueing theory is prohibitively expensive, as the required time and memory complexity tends to scale poorly in the system size. The aim of this project is to introduce and analyze new queueing models that provide insight into the performance of existing and novel load balancing and scheduling algorithms for large-scale systems. The problems under consideration include affinity scheduling problems motivated by MapReduce clusters, load balancers that make use of redundancy to mitigate latency caused by server unpredictability, and stateful load balancers. The main envisioned methodology consists in developing fluid approximations that are validated using simulation experiments and that can be shown to become exact as the system size tends to infinity. The project combines techniques from stochastic modelling, probability, dynamical systems, numerical analysis and simulation.
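To illustrate the fluid-approximation methodology on a textbook case rather than on the project's own models: for "power-of-d-choices" load balancing, where every arrival joins the shortest of d randomly sampled queues, the fluid model has the well-known fixed point s_i = lambda^((d^i - 1)/(d - 1)) for the fraction of servers holding at least i jobs. The sketch below simulates a finite system and compares it with that fixed point; the system size, load, and event count are arbitrary choices for illustration.

    # Power-of-two-choices versus its fluid fixed point (illustrative):
    # each arrival samples d=2 servers uniformly, joins the shorter queue.
    import numpy as np

    rng = np.random.default_rng(42)
    N, lam, d = 200, 0.9, 2                # servers, load, choices
    q = np.zeros(N, dtype=int)             # queue length per server
    tail, samples, events = np.zeros(5), 0, 500_000

    for step in range(events):
        busy = np.count_nonzero(q)
        # Gillespie step: arrival w.p. N*lam/(N*lam + busy), else departure
        if rng.random() < N * lam / (N * lam + busy):
            picks = rng.integers(0, N, size=d)
            q[picks[np.argmin(q[picks])]] += 1   # join the shorter queue
        else:
            servers = np.flatnonzero(q)          # departures need a busy server
            q[servers[rng.integers(0, servers.size)]] -= 1
        if step > events // 2 and step % 500 == 0:
            tail += [np.mean(q >= i) for i in range(1, 6)]
            samples += 1

    # Fluid fixed point: fraction with >= i jobs is lam**(2**i - 1)
    for i in range(1, 6):
        print(i, round(tail[i - 1] / samples, 3), round(lam ** (2**i - 1), 3))

The simulated tail probabilities land close to the fixed point already at N = 200 and approach it as N grows, which is the "exact as the system size tends to infinity" behavior the project aims to prove for richer scheduling and redundancy models.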