Enrique is a researcher at SINTEF, Oslo, Norway. His research interests include Ambient Intelligence, Security, Context Recognition, Machine Learning, Pervasive Healthcare, and Wearable Sensors.

email: e.g.mx (at) ieee.org


Mental healthcare adaptive systems and monitoring with wearable devices: personalized monitoring systems for mental healthcare applications. This line of work analyzes sensor data (smartphones, VR, wrist bands, physiological sensors) using machine learning for episode prediction, and develops algorithms that provide personalized assistance based on the user's current behavior.

Stress Detection through Smartphones' Data. With the advent of wearable devices and their embedded sensors, it has become possible to collect data from which contextual and behavioral information can be extracted. This work focuses on detecting different stress levels in working environments by analyzing sensor data from smartphones.

Papers: Garcia-Ceja, E., V. Osmani, O. Mayora. "Automatic Stress Detection in Working Environments from Smartphones' Accelerometer Data: A First Step". IEEE Journal of Biomedical and Health Informatics, vol. 20, no. 4, pp. 1053-1060, July 2016.
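The general shape of such a pipeline can be sketched as follows: split the accelerometer stream into fixed-length windows, compute simple statistical features per window, and train a classifier on labeled stress levels. This is an illustrative sketch only; the window length, features, classifier, and the synthetic data below are assumptions, not the paper's exact setup.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def window_features(acc, fs=50, win_s=60):
    """Split a (n, 3) accelerometer stream into fixed windows and
    compute simple statistics (mean, std, and magnitude stats)."""
    win = fs * win_s
    feats = []
    for start in range(0, len(acc) - win + 1, win):
        w = acc[start:start + win]
        mag = np.linalg.norm(w, axis=1)  # per-sample acceleration magnitude
        feats.append([w.mean(), w.std(), mag.mean(), mag.std()])
    return np.array(feats)

# Synthetic data standing in for two stress conditions (hypothetical):
# calmer movement vs. more agitated movement.
rng = np.random.default_rng(0)
low = rng.normal(0.0, 0.5, size=(50 * 60 * 10, 3))   # 10 one-minute windows
high = rng.normal(0.0, 1.5, size=(50 * 60 * 10, 3))  # 10 one-minute windows
X = np.vstack([window_features(low), window_features(high)])
y = np.array([0] * 10 + [1] * 10)  # 0 = low stress, 1 = high stress

clf = RandomForestClassifier(random_state=0).fit(X, y)
```

In practice the labels would come from self-reports or expert annotation rather than synthetic classes, and the feature set would be richer than these four statistics.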

Long-term activity segmentation. Activity recognition has become a very active research topic. High accuracy has been achieved for recognizing simple activities; however, complex activities remain challenging. Several works have used a rich collection of sensors to detect complex activities. In this work, we aim to segment (infer the start and end times of) complex activities with a single accelerometer.

Papers: Garcia-Ceja, E., R. Brena, J.C. Carrasco-Jimenez, L. Garrido. "Long-Term Activity Recognition from Wristwatch Accelerometer Data". Sensors, vol. 14, no. 12, pp. 22500-22524, doi:10.3390/s141222500, MDPI, 2014. [pdf]
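The segmentation idea can be illustrated with a simplified sketch: predict a label for each short window, then merge consecutive windows that share a label into segments with inferred start and end times. The activity names and per-minute windows below are illustrative, not the paper's exact method.

```python
def merge_segments(labels, win_s=60):
    """Merge consecutive windows with the same predicted label into
    (activity, start_seconds, end_seconds) segments."""
    segments = []
    start = 0
    for i in range(1, len(labels) + 1):
        # Close a segment at the end of the list or on a label change.
        if i == len(labels) or labels[i] != labels[start]:
            segments.append((labels[start], start * win_s, i * win_s))
            start = i
    return segments

# Hypothetical per-minute window predictions for a 7-minute stream:
preds = ["shopping", "shopping", "shopping", "exercising",
         "exercising", "working", "working"]
print(merge_segments(preds))
# [('shopping', 0, 180), ('exercising', 180, 300), ('working', 300, 420)]
```

The start/end times recovered this way are only as fine-grained as the window length, which is the usual trade-off in window-based segmentation.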

Contextualized hand gesture recognition. Most previous work on hand gesture recognition focuses on increasing the accuracy and robustness of the systems; however, little has been done to understand the context in which the gestures are performed, i.e., the same gesture can mean different things depending on the context and situation. In this work, we used location information to contextualize the gestures: the system continuously identifies the user's location, so that when a gesture is performed, the system can choose an action based on both the gesture and where it occurred.

Papers: Garcia-Ceja, E., R. Brena, C. Galván-Tejada. "Contextualized Hand Gesture Recognition with Smartphones". MCPR 2014: 6th Mexican Conference on Pattern Recognition, June 25-28, Cancun, Mexico, Lecture Notes in Computer Science, vol. 8495, pp. 122-131, 2014. [pdf]
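The core idea, mapping the same gesture to different actions depending on location, can be sketched as a simple (location, gesture) lookup. The locations, gestures, and actions below are hypothetical examples, not from the paper.

```python
# Hypothetical context table: the same gesture triggers different
# actions depending on the user's current location.
ACTIONS = {
    ("living_room", "circle"): "dim lights",
    ("kitchen", "circle"): "start timer",
    ("living_room", "swipe"): "next channel",
}

def contextual_action(location, gesture):
    """Resolve a recognized gesture to an action, using the user's
    identified location as context."""
    return ACTIONS.get((location, gesture), "no action")

print(contextual_action("kitchen", "circle"))   # start timer
print(contextual_action("bedroom", "circle"))   # no action
```

In the actual system, the location and gesture labels would come from the recognition pipeline rather than being passed in directly; the table only illustrates how context disambiguates an otherwise identical gesture.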