Development and Implementation of a Functional Module for the Classification of Facial Expressions of Pain Using Supervised Artificial Intelligence Models

Authors

  • Gabriel Vega Martinez
  • Cinthya Lourdes Toledo-Peral
  • Jorge Airy Mercado Gutiérrez
  • Carlos Alonso Montoya Ledesma
  • Sonia Patricia Romano Riquer

Affiliation: Instituto Nacional de Rehabilitación “Luis Guillermo Ibarra Ibarra”
ORCID: https://orcid.org/0009-0003-8704-6510

Keywords:

Facial expressions, pain, artificial intelligence, convolutional neural networks, automatic classification

Abstract

Introduction:
The objective evaluation of pain from facial expressions remains a significant challenge in health and technology, as pain perception is subjective and its clinical quantification is complex. Facial expressions carry key information about the pain experience, and their automatic analysis can support decision-making in clinical monitoring, rehabilitation, and human-machine interfaces. Convolutional Neural Networks (CNNs) have shown strong performance in image classification, including the recognition of facial patterns associated with emotions and pain.

Objective:
To develop and implement a functional module based on CNNs for the automatic classification of facial expressions of pain, integrating normalization, data augmentation, and advanced performance evaluation.

Methodology:
The OSF Facial Expression of Pain dataset was used, consisting of 1,200 images from 60 participants labeled as pain or no pain. Images were loaded in MATLAB, resized to 224 × 224 × 3 pixels, and split into a training set containing 70% of the images per label and a validation set containing the remainder. Data augmentation included random rotation (−10° to 10°), X and Y translation (−5 to 5 px), and horizontal reflection to improve model robustness. The architecture was a CNN with three convolutional blocks of 8, 16, and 32 filters, respectively, each followed by batch normalization and ReLU activation, with max pooling after the first two blocks, followed by a fully connected layer with two neurons for binary classification and, finally, softmax and classification layers. Training used the Stochastic Gradient Descent with Momentum (SGDM) optimizer with an initial learning rate of 0.01, for 18 epochs with validation every 30 iterations. Precision, recall, and F1-score were calculated for each class, together with global accuracy. The entire process ran on a single CPU (AMD Ryzen™ Z1 Extreme) in approximately 2 minutes and 29 seconds.
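For illustration, the pipeline described above can be sketched in MATLAB with the Deep Learning Toolbox. This is a minimal reconstruction from the description, not the authors' code: the dataset folder layout, the 3 × 3 convolution filter size, and the 2 × 2 pooling windows are assumptions.

    % Minimal sketch; a 'pain_dataset' folder with one subfolder per label
    % (pain / no_pain) is an assumed layout, not the published one.
    imds = imageDatastore('pain_dataset', ...
        'IncludeSubfolders', true, 'LabelSource', 'foldernames');
    [imdsTrain, imdsVal] = splitEachLabel(imds, 0.70, 'randomized');

    % Augmentation as described: rotation, X/Y translation, horizontal flip
    augmenter = imageDataAugmenter( ...
        'RandRotation',     [-10 10], ...
        'RandXTranslation', [-5 5], ...
        'RandYTranslation', [-5 5], ...
        'RandXReflection',  true);
    augTrain = augmentedImageDatastore([224 224 3], imdsTrain, ...
        'DataAugmentation', augmenter);
    augVal   = augmentedImageDatastore([224 224 3], imdsVal);

    % Three convolutional blocks (8, 16, 32 filters) with batch
    % normalization and ReLU; max pooling after the first two blocks;
    % two-neuron fully connected layer for the binary decision
    layers = [
        imageInputLayer([224 224 3])
        convolution2dLayer(3, 8, 'Padding', 'same')   % filter size assumed
        batchNormalizationLayer
        reluLayer
        maxPooling2dLayer(2, 'Stride', 2)
        convolution2dLayer(3, 16, 'Padding', 'same')
        batchNormalizationLayer
        reluLayer
        maxPooling2dLayer(2, 'Stride', 2)
        convolution2dLayer(3, 32, 'Padding', 'same')
        batchNormalizationLayer
        reluLayer
        fullyConnectedLayer(2)
        softmaxLayer
        classificationLayer];

    % SGDM, learning rate 0.01, 18 epochs, validation every 30 iterations
    options = trainingOptions('sgdm', ...
        'InitialLearnRate',    0.01, ...
        'MaxEpochs',           18, ...
        'ValidationData',      augVal, ...
        'ValidationFrequency', 30, ...
        'ExecutionEnvironment','cpu', ...
        'Plots',               'training-progress');

    net = trainNetwork(augTrain, layers, options);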

Results:
The model achieved a global validation accuracy of 97.81%, with accuracy rising steadily during training and the loss function converging toward zero. The average processing time was approximately 0.115 seconds per image, demonstrating feasibility for real-time applications. For the Pain class, recall was 99.10% and the F1-score was 96.49%; for the No Pain class, recall was 96.58% and the F1-score was 99.12%.
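As a sketch of how the per-class metrics reported above can be derived, assuming the net, augVal, and imdsVal variables from the training sketch (confusionmat requires the Statistics and Machine Learning Toolbox):

    % Classify the validation set and build the confusion matrix
    YPred = classify(net, augVal);
    YTrue = imdsVal.Labels;
    C = confusionmat(YTrue, YPred);   % rows: true class, columns: predicted

    % Per-class recall and precision, then F1 as their harmonic mean
    recall    = diag(C) ./ sum(C, 2);
    precision = diag(C) ./ sum(C, 1)';
    f1        = 2 * (precision .* recall) ./ (precision + recall);

    % Global validation accuracy
    accuracy = sum(diag(C)) / sum(C(:));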

Conclusions:
The developed functional module demonstrates adequate performance in the automatic classification of facial expressions of pain using supervised AI, integrating data augmentation and advanced evaluation metrics. This development shows potential for integration into clinical monitoring systems, rehabilitation, and assistive devices, contributing to technological innovation in digital health for objective pain assessment.

Publication Facts

  Metric                 This article    Other articles
  Peer reviewers         0               2.4
  Reviewer profiles      N/A

  Author statements      This article    Other articles
  Data availability      N/A             16%
  External funding       N/A             32%
  Competing interests    No              11%

  Metric                 This journal    Other journals
  Articles accepted      20%             33%
  Days to publication    129             145

Published

2025-11-11

How to Cite

Vega Martinez G, Toledo-Peral CL, Mercado Gutiérrez JA, Montoya Ledesma CA, Romano Riquer SP. Development and Implementation of a Functional Module for the Classification of Facial Expressions of Pain Using Supervised Artificial Intelligence Models. Invest. Discapacidad [Internet]. 2025 Nov. 11 [cited 2025 Nov. 20];11(S1). Available from: https://dsm.inr.gob.mx/indiscap/index.php/INDISCAP/article/view/474
