Developing Physical Human–Robot Collaboration: Visuospatial Attention, Adaptation, and Interpersonal Synchronization
SCORZA AZZARÀ, GIULIA
2025-11-25
Abstract
In everyday life, humans engage in complex, adaptive, and personalized interactions with partners whose perceptual and motor abilities may differ from their own. Recent advances in collaborative technologies have made social robots increasingly accessible as interaction partners. However, a significant gap remains in understanding how human-robot interaction (HRI) influences human perceptual and cognitive systems. This thesis investigates how spatial perception and attentional mechanisms are modulated during physical collaboration with robots. Specifically, it explores whether engaging in joint tasks with a humanoid robot can extend the “near-hand effect”, the enhancement of spatial attention near one’s own hand, toward the robot’s hand. Previous work has shown that a human partner’s hand can be integrated into one’s spatial representation following joint action. Here, I test whether a similar extension of the body schema occurs with a robotic partner and identify the conditions that support or hinder this integration. The impact of my research is twofold: (i) understanding and modeling the attentional mechanisms underlying effective human interaction and (ii) developing technologies that leverage these mechanisms to improve collaboration with robots. In a series of behavioral experiments, I systematically investigated whether and how collaborative interaction with a robot influences the development of an interpersonal joint body schema and modulates human visuospatial attention. Using the humanoid robot iCub, I implemented a collaborative physical joint action designed to require effective coordination and shared goals. Visuospatial attention was measured before and after the collaborative HRI using the Posner cueing task, a well-established method for studying how attention is oriented and shifted in space.
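For readers unfamiliar with the paradigm, the logic of a Posner cueing experiment can be illustrated with a toy simulation. All parameter values below (baseline RT, cueing benefit, noise) are illustrative assumptions, not values measured in this thesis:

```python
import random
from statistics import mean

def simulate_posner_trials(n_trials=200, cue_validity=0.8, seed=42):
    """Simulate a Posner cueing task: a cue indicates the likely target
    location, and reaction times (RTs) are faster on validly cued trials."""
    rng = random.Random(seed)
    trials = []
    for _ in range(n_trials):
        cue_side = rng.choice(["left", "right"])
        valid = rng.random() < cue_validity
        target_side = cue_side if valid else ("right" if cue_side == "left" else "left")
        # Toy RT model: ~350 ms baseline, ~30 ms attention benefit for valid
        # cues, Gaussian noise (illustrative numbers only).
        rt_ms = rng.gauss(350 if valid else 380, 20)
        trials.append({"valid": valid, "target_side": target_side, "rt_ms": rt_ms})
    return trials

def validity_effect(trials):
    """Mean RT difference, invalid minus valid: positive = cueing benefit."""
    valid_rts = [t["rt_ms"] for t in trials if t["valid"]]
    invalid_rts = [t["rt_ms"] for t in trials if not t["valid"]]
    return mean(invalid_rts) - mean(valid_rts)

trials = simulate_posner_trials()
effect = validity_effect(trials)  # expected to be positive (a cueing benefit)
```

In the thesis, the quantity of interest is an analogous RT advantage for targets appearing near the robot's hand, compared before and after the collaborative interaction.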
Key interaction variables were manipulated, including sensory conditions (vision versus blindfolded interaction) and robot behavior (adaptive versus non-adaptive interaction). To further examine these mechanisms in a more ecologically valid context, I developed a human–human version of the joint task with two levels of coordination difficulty (physical versus non-physical interaction). This allowed testing whether interpersonal synchronization is a core mechanism underlying attentional modulation in joint action. Findings revealed that collaborative interaction significantly enhanced target detection near the robot’s hand, an effect absent before the interaction, demonstrating that spatial attention can be redirected toward a robotic partner. Vision proved essential, as blindfolded interaction did not produce this effect. The robot’s adaptive behavior supported personalized interactions, smoother coordination, and improved task performance, while also eliciting a reliable attentional bias near the robot’s hand. The human–human study revealed that synchronization alone was insufficient to modulate visuospatial attention. In the non-physical condition, the attentional bias even decreased after the interaction, suggesting that additional mechanisms beyond motor synchronization are required to support attentional alignment. Overall, this thesis shows that physical collaboration with a humanoid robot reshapes human visuospatial attention and contributes to the formation of a joint body schema, providing insights into how social and sensorimotor variables modulate attention in collaborative settings. The knowledge gained from this research advances our understanding of embodied cognition in interaction with robotic agents and has significant implications for the design and deployment of robotic systems across various domains, from rehabilitation and prosthetics to human-robot teaming in industrial and service environments. 
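Interpersonal synchronization, as examined in the human–human study, can be quantified in many ways. One minimal sketch, assuming each partner's movement is recorded as a sampled one-dimensional signal (the function names and toy signals below are illustrative, not code from the thesis), is the lag that maximizes the cross-correlation between the two signals:

```python
import math

def pearson(a, b):
    """Pearson correlation of two equal-length sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((ai - ma) * (bi - mb) for ai, bi in zip(a, b))
    sa = math.sqrt(sum((ai - ma) ** 2 for ai in a))
    sb = math.sqrt(sum((bi - mb) ** 2 for bi in b))
    return cov / (sa * sb)

def best_lag(x, y, max_lag):
    """Lag (in samples) at which y best aligns with x; positive means y
    trails x. A crude index of interpersonal synchrony."""
    def shifted_corr(lag):
        if lag >= 0:
            return pearson(x[:len(x) - lag], y[lag:])
        return pearson(x[-lag:], y[:len(y) + lag])
    return max(range(-max_lag, max_lag + 1), key=shifted_corr)

# Toy example: partner B reproduces partner A's movement 5 samples later.
a = [math.sin(0.1 * t) for t in range(200)]
b = [math.sin(0.1 * (t - 5)) for t in range(200)]
lag = best_lag(a, b, max_lag=20)  # recovers the 5-sample delay
```

A small, stable lag with high peak correlation would indicate tight coupling; the thesis's finding is that such motor synchronization by itself did not suffice to produce the attentional bias.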
Future work will explore joint action with non-anthropomorphic systems, such as industrial manipulators, to determine whether attentional processing depends on anthropomorphic features or on the collaborative nature of the task.

| File | Size | Format |
|---|---|---|
| phdunige_4376318.pdf (doctoral thesis; under embargo until 25/11/2026) | 9.72 MB | Adobe PDF |



