Cristian Castilho completes his PhD

Thesis topic: Prioritization of proactive operational indicators in occupational safety and health

Author: Cristian dos Santos Castillo

Supervisors: Pedro Miguel Martins Arezes; Mohammed Shahriari; Fabricio Casarejos

Date: 23/09/2021

Doctoral Programme: Doctoral Programme in Industrial and Systems Engineering (Programa Doutoral em Engenharia Industrial e de Sistemas)

Abstract: Humans unconsciously rely on patterns of non-verbal language, such as posture, gestures and body motion, to infer the effectiveness of communication in their social interactions. As humans, we have an innate ability to recognize and respond to such social signaling. Social signaling is also present in other contexts, such as Human-Machine Interaction (HMI) and Human-Computer Interaction (HCI), though for the most part it is a communication modality that is ignored during the interaction.
Our daily dependence on computerized systems emphasizes the need for, and the importance of, good interaction quality.
Recent technological developments have brought to the market a new generation of non-invasive, low-cost sensors, such as 3D cameras and motion sensors, enabling computers to read subtle aspects and cues of human behavior during HCI.
In this thesis we propose a Support Vector Machine model that infers the level of task difficulty experienced by a user while interacting with a tablet, using features extracted from body motion, hand motion and other interaction metrics.
This is a contribution to the Human-Computer Interaction field: it explores social signals, extracted using commercial sensors, to describe and understand the interaction context, and it proposes a model that combines motion and gesture features with the device's log-file information to infer a user's task difficulty.
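To illustrate the kind of model the abstract describes, the following is a minimal sketch of a Support Vector Machine classifier for task-difficulty inference, using scikit-learn. The feature names and the synthetic data are purely illustrative assumptions, not the thesis's actual features or dataset; the thesis extracts its features from body motion, hand motion and device log files.

```python
# Hypothetical sketch (not the thesis's implementation): an SVM that
# labels an interaction sample as "easy" (0) or "hard" (1) from a small
# feature vector. Feature names and data are invented for illustration.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Synthetic stand-ins for extracted features:
# [mean body sway, hand-speed variance, taps per minute]
easy = rng.normal([0.2, 0.1, 30.0], 0.05, size=(50, 3))
hard = rng.normal([0.6, 0.4, 12.0], 0.05, size=(50, 3))
X = np.vstack([easy, hard])
y = np.array([0] * 50 + [1] * 50)  # 0 = easy task, 1 = hard task

# Standardize the features, then fit an RBF-kernel SVM
model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
model.fit(X, y)

# Classify a new interaction sample that lies close to the "hard" cluster
sample = [[0.55, 0.35, 14.0]]
print(model.predict(sample)[0])
```

Standardizing before the SVM matters here because the raw features live on very different scales (seconds, variances, counts per minute), and an RBF kernel is distance-based.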
