Hello, world!
I'm Kayhan [kʲejhɒːn] Latifzadeh. I studied Computer Engineering at the University of Guilan, Rasht, Iran. I then pursued a Master's degree in Artificial Intelligence and Robotics at SRTTU, Tehran, Iran. Currently, I am a PhD candidate in Computer Science at the Computational Interaction (COIN) group at the University of Luxembourg, Luxembourg, under the supervision of Prof. Luis A. Leiva. My research focuses on decoding physiological signals in various contexts of Human-Computer Interaction. So far, my research has relied on electroencephalogram (EEG) signals as well as eye and head movements.
Throughout my PhD studies, I have had the opportunity to collaborate as a visiting researcher
with the Humans Interacting with Computers (HICUP) lab at the University of Primorska, Slovenia (under the supervision of Prof. Klen Čopič Pucihar and Prof. Matjaž Kljun),
the Information eXperience (IX) Lab at the University of Texas at Austin, USA (under the supervision of Prof. Jacek Gwizdka),
and the Advanced Mixed Reality Interfaces Lab at the Graz University of Technology, Austria (under the supervision of Prof. Alexander Plopski).
kayhan DOT latifzade AT_SYMBOL uni DOT lu
Maison du Nombre, 6 avenue de la Fonte, L-4364 Esch-sur-Alzette, Luxembourg
Publications
- Kayhan Latifzadeh, Jacek Gwizdka, Luis A. Leiva. (2025). A Versatile Dataset of Mouse and Eye Movements on Search Engine Results Pages. In Proceedings of the 48th International ACM SIGIR Conference on Research and Development in Information Retrieval. In Press.
- Mario Villaizán-Vallelado, Matteo Salvatori, Kayhan Latifzadeh, Antonio Penta, Luis A. Leiva, Ioannis Arapakis. (2025). AdSight: Scalable and Accurate Quantification of User Attention in Multi-Slot Sponsored Search. In Proceedings of the 48th International ACM SIGIR Conference on Research and Development in Information Retrieval. In Press.
- Kayhan Latifzadeh, Luis A. Leiva, Klen Čopič Pucihar, Matjaž Kljun, Iztok Devetak, Lili Steblovnik. (2025). Assessing Medical Training Skills via Eye and Head Movements. In Proceedings of the 33rd ACM Conference on User Modeling, Adaptation and Personalization (UMAP). In Press.
- Kayhan Latifzadeh, Luis A. Leiva. (2025). Thalamus: A User Simulation Toolkit for Prototyping Multimodal Sensing Studies. In Adjunct Proceedings of the 33rd ACM Conference on User Modeling, Adaptation and Personalization (UMAP LBR). In Press.
- Syrine Haddad, Kayhan Latifzadeh, Saravanakumar Duraisamy, Jean Vanderdonckt, Olfa Daassi, Safya Belghith, Luis A. Leiva. (2024). Good GUIs, Bad GUIs: Affective Evaluation of Graphical User Interfaces. In Proceedings of the 32nd ACM Conference on User Modeling, Adaptation and Personalization (UMAP). DOI
- Kayhan Latifzadeh, Nima Gozalpour, V Javier Traver, Tuukka Ruotsalo, Aleksandra Kawala-Sterniuk, Luis A. Leiva. (2024). Efficient Decoding of Affective States from Video-elicited EEG Signals: An Empirical Investigation. ACM Transactions on Multimedia Computing, Communications and Applications. ACM New York, NY. DOI
- Kayhan Latifzadeh, Luis A. Leiva. (2022). Gustav: Cross-device Cross-computer Synchronization of Sensory Signals. In Adjunct Proceedings of the 35th Annual ACM Symposium on User Interface Software and Technology. 1--3. DOI
- Reza Sarailoo, Kayhan Latifzadeh, S Hamid Amiri, Alireza Bosaghzadeh, Reza Ebrahimpour. (2022). Assessment of instantaneous cognitive load imposed by educational multimedia using electroencephalography signals. Frontiers in Neuroscience. 16, 744737. Frontiers. DOI
Academic Service
- Student Volunteers Co-Chair, 2024 ACM Conference on Conversational User Interfaces (CUI '24), Luxembourg City, Luxembourg, 8 to 10 July 2024
- Student Volunteer, The 8th Summer School on Computational Interaction (CIX'24), Belval, Luxembourg, 3 to 7 June 2024
- Student Volunteer, 2023 ACM Conference on Human Factors in Computing Systems (CHI '23), Hamburg, Germany, 23 to 28 April 2023
Teaching
- Web Development 1 - Teaching Assistant
Taught practical sessions (JavaScript basics) and graded exams
Faculty of Science, Technology and Medicine (FSTM), University of Luxembourg
Belval, Luxembourg, 2022-2025
- Human-Computer Interaction (HCI) - Guest Lecturer
Delivered a presentation on the affective evaluation of graphical user interfaces.
Faculty of Science, Technology and Medicine (FSTM), University of Luxembourg
Belval, Luxembourg, 2024-2025
Alumni
- Hugo Barthelemy, BSc student and research assistant, University of Luxembourg, 2023-2025
- Bachelor Semester Project; Gesture Symphony: Playing music using hand gestures [GitHub Repo]
- Bachelor Semester Project; AdSight: Social Media Marketing and Advertising Optimization through Eye Tracking [GitHub Repo]
- Bachelor Semester Project; Exploring Jupiter’s Galilean Moons: A Comparative Study of VR Learning vs. Traditional Methods
- David Pereira de Magalhaes, BSc student, University of Luxembourg, 2025
- Bachelor Semester Project; TEMO: Emotion Recognition Based Drone Interaction [GitHub Repo]
- Saad Shakeel, BSc student, University of Luxembourg, 2024
- Bachelor Semester Project; Designing a Website to Measure User Emotions During Multimedia Interaction: Valence and Arousal Reporting
- Giorgos Kotsias, BSc student, University of Luxembourg, 2024
- Bachelor Semester Project; Design and Implementation of a Website to Measure User Cognitive Load During Memory Games of Varying Difficulty
- Joao Bernardo Sousa Faria, BSc student, University of Luxembourg, 2024
- Bachelor Semester Project; GIFective: A Website for Grading GIFs in Conveying Emotional Moods
- Karyna Ouahrani, BSc student, University of Luxembourg, 2023
- Bachelor Semester Project; BeatsLab: Development of a Web Application for Drumming and Exploring Usability Methods on it
Personal
In my spare time, I enjoy reading books, with a particular interest in novels, psychology, and cognitive neuroscience.
Feel free to check out my Goodreads page, though I do not update it regularly. I also play video games, primarily on PlayStation, and I enjoy going for long walks. Oh, and one more thing: if I have time on the weekend, I love cooking, mainly Persian cuisine.