Education

Courses

  1. Conversational design with multi-modalities (TU/e, 2023–2024)
    DCM220. Focus on multi-modal feedback for automotive applications.

  2. Data analytics for engineers (TU/e, 2023–2024)
    2IAB0. Basics of data analytics with Python.

  3. Automotive human factors (TU/e, 2022–)
    0HM310. Exploration of the relation between a car or truck, its human driver, and the dynamic environment.

Guest lectures

  1. Guest lectures during the Human-Machine Systems course led by Dr. Joost de Winter at TU Delft (2015).

Supervision of PhD students

  1. Shadab Alam (2023)
    Connected data from automated vehicles and road users for traffic safety.

  2. Rutger Verstegen (2023)
    Uncertainty in interaction between automated vehicles and other road users.

  3. Roulin Gao (2021)
    Data-driven assessment of driving behaviour of young and elderly drivers.

Supervision of MSc students and student groups

  1. Peihang Li (2024)
    Characterizing situational awareness of transport operators based on eye tracking.

  2. Job van Houten (2024)
    Relation between trust and eye gaze behaviour of the driver of an automated vehicle.

  3. Alexandra van Dijk (2024) – M2.1 project within Future Mobility squad
    Using generative AI for person-based traffic research.

  4. Xingjian Zeng (2024) – within Future Mobility squad
    In-vehicle animal-inspired agent.

  5. Thomas Marinissen (2024) – within Future Mobility squad
    Voice control of the in-car UI.

  6. Xander Fanchamps, Giovanni Sapienza, Thomas van Heuvelen, Matin Maskitou (2024) – within Future Mobility squad
    Increasing situational awareness by enhancing the experience of non-intrusive information flow while engaging in an NDRT in level 3+ automated vehicles.

  7. Yuanzi Wang (2023)
    A real road study of automated driving: The influence of the non-critical cue on automation surprise (link).

  8. Thomas Marinissen (2023) – M2.1 project within Future Mobility squad
    Exploring how to teach multimodal interaction in cars to first-time users.

  9. Tom Bergman, Ariën Helder, Emir Kadrić, Yuanyuan Xu (2023) – within Future Mobility squad
    Emergency vehicle assistant for a level 3 automated vehicle (link).

  10. Alexandra van Dijk (2023) – M1.2 project within Future Mobility squad
    Reducing stress using heart-rate and eye gaze data (link).

  11. Lokkeshver Kumaaravelu (2022)
    Design of a real-time computer-vision and gaze-based pedestrian warning system (link).

  12. Iva Surana (2022)
    Stability in truck driving behaviour (link).

  13. Noureddine Begga (2022)
    Looking behaviour of pedestrians.

  14. Lauren Ickenroth (2021)
    Driver-passenger synchrony: The identification of synchrony in head orientation and movement (link).

  15. Johnson Mok (2021)
    A two-agent VR study: The effects of driver eye gaze visualisation on AV-pedestrian interaction (link).

  16. Jim Hoogmoed (2021)
    Human driver risk perception model (link).

  17. Daniel van den Haak (2021)
    Automated lane changing using deep reinforcement learning: A user-acceptance case study (link).

  18. Bram Kooijman (2021)
    The identification of factors affecting drivers’ perceived risk in pedestrian-vehicle interaction: A crowdsourcing study (link).

  19. Lokin Prasad (2021)
    Identifying lane changes automatically using the GPS sensor of portable devices (link).

  20. Vishal Onkhar (2020)
    Algorithmic detection of eye contact in driver-pedestrian interactions (link).

  21. Anirudh Sripada (2020)
    Using an automated vehicle’s lateral deviation to communicate vehicle intent to the pedestrian (link).

  22. Max Oudshoorn (2020)
    Bio-inspired intent communication for automated vehicles (link).

  23. Leo van der Eijk (2019)
    Auditory feedback for communication between a vehicle and a pedestrian.

  24. Tom Driessen (2019)
    Feeling uncertain: Effects of encoding uncertainty in the tactile communication of a spatiotemporal feature (link).

Supervision of BSc students and student groups

  1. Jochem Verstegen (2023) – within Future Mobility squad
    Slidable bicycle handle for AV interaction (link).

  2. Wouter Salden, Jakub Woziwodzki, Axel Reitz (2023) – within Future Mobility squad
    Designing an emergency drone (link).

  3. Jesper Kapteijns (2022) – within Future Mobility squad
    eHMI inspired by movement of liquid (link).

  4. Matin Maskitou (2022) – within Future Mobility squad
    Light-based presentation of objects around an automated vehicle (link).

  5. Van Oeveren, C., Kroese, J., Brockhoff, L., Van Genderen, R. (2020)
    Communication between automated vehicles and pedestrians: A standalone external Human Machine Interface (link).

  6. Hopmans, B., Wesdorp, D., Visscher, J., De Vlam, V. (2019)
    Visual scanning behaviour in a parking lot (link).

  7. Mallant, K. P. T., Roosens, V. E. R., Middelweerd, M. D. L. M., Overbeek, L. D. (2019)
    The working of a directional external Human-Machine Interface in near-collision tested with a coupled simulator (link).

  8. Bijker, L., Dielissen, T., French, S., Mooijman, T., Peters, L. (2018)
    Blind driving by means of the predicted track angle error (link).

  9. Heisterkamp, N., Haddou, A., Luik, P., Klevering, S., Zult, M. (2017)
    Measuring eye movements in a GTA V cycling simulation (link).

  10. Brunt, A., Schijf, M., Schrader, E., Vroom, N. (2017)
    The effects of auditory feedback in a racing simulator on a car’s slip angle relative to the ideal slip angle (link).

  11. De Koning, C., Lingmont, H., De Lint, T., Van den Ouden, F., Van der Sijs, T. (2017)
    Comparing an intelligent and a conventional headway-based auditory feedback system on safety and acceptance in an on-road car following experiment (link).

  12. Van der Aa, A., Spruit, J., Schoustra, M., Staats, L., Van der Vlist, K. J. (2016)
    Directionalization and classification of motorized vehicles using a smartphone (link).

  13. Van Haarlem, W., Quraishi, H., Berssenbrugge, C., Binda, J. (2015)
    Detection of objects by means of sound (link).

  14. Al Jawahiri, A., Kapel, P., Mulckhuyse, J., Wagenaar, S. (2015)
    Oculus Rift: Does it improve depth perception? (link).

  15. Van der Geest, L., Van Leeuwen, S., Numan, B., Pijnacker, J. (2015)
    Blind driving by means of auditory feedback (link).

  16. Beaumont, C., Van der Geest, X., De Jonge, R., Van der Kroft, K. (2015)
    Blind driving by means of a steering-based predictor algorithm (link).

  17. De Jager, M., Struijk, J., Sleegers, D., Skenderi, N. (2014)
    Navigation by auditory feedback (link).

Supervision of internships

  1. Job van Houten (2023)
    Relation between trust and eye gaze behaviour of the driver of an automated vehicle.

  2. William Yen (2023)
    Flexible electronic circuits.

  3. Lena Siegling (2020)
    A naturalistic pilot study of cyclists’ eye- and head movement using head-mounted eye tracking (link).

Defence committees

  1. Dylan van Oosterhout (MSc, 2024) – at TU Eindhoven
    Exploring how natural patterns facilitated through cymatics can be utilized to create a calming and relaxing atmosphere within the living room context.

  2. Jeroen Brattinga (MSc, 2024) – at TU Eindhoven
    Imagination at work: An alternative way of having conversations in an organisational setting.

  3. Derck Chu (MSc, 2023) – at TU Eindhoven
    ARive: Assisting drivers with in-car augmented reality for risk zone detection (link).

  4. Emmie Knoester (MSc, 2023) – at TU Eindhoven
    The research, design, and development of a city monitor for the municipality of Eindhoven.

  5. Davanshi Bansal (MSc, 2023) – at TU Eindhoven
    Guiding framework for automotive HMI developers to make technologies experienceable to users (link).