Automated Pain Assessment in Children Using Electrodermal Activity and Video Data Fusion via Machine Learning


Susam B. T., Riek N., Akcakaya M., Xu X., De Sa V., Nezamfar H., ...More

IEEE Transactions on Biomedical Engineering, vol. 69, no. 1, pp. 422-431, 2022 (SCI-Expanded)

  • Publication Type: Article / Full Article
  • Volume: 69 Issue: 1
  • Publication Date: 2022
  • DOI: 10.1109/tbme.2021.3096137
  • Journal Name: IEEE Transactions on Biomedical Engineering
  • Journal Indexes: Science Citation Index Expanded (SCI-EXPANDED), Scopus, Academic Search Premier, Aerospace Database, Applied Science & Technology Source, Biotechnology Research Abstracts, Business Source Elite, Business Source Premier, Communication Abstracts, Compendex, Computer & Applied Sciences, EMBASE, INSPEC, MEDLINE, Metadex, Civil Engineering Abstracts
  • Page Numbers: pp. 422-431
  • Keywords: computer vision, Electrodermal activity (EDA), facial expression, galvanic skin response (GSR), pain assessment
  • Affiliated with Hakkari University: No

Abstract

Objective: Pain assessment in children continues to challenge clinicians and researchers, as subjective experiences of pain must be inferred from observable behaviors, both involuntary and deliberate. The presented approach supplements subjective self-report-based methods by fusing electrodermal activity (EDA) recordings with video facial expressions to develop an objective pain assessment metric. Such an approach is especially important for assessing pain in children who cannot provide accurate self-reports of pain and therefore require nonverbal pain assessment. We demonstrate the performance of our approach using data recorded from children in post-operative recovery following laparoscopic appendectomy. We examined the usefulness of EDA and video facial expression data, separately and in combination, as predictors of children's self-reports of pain throughout post-surgical recovery. Findings indicate that EDA and facial expression data independently provide above-chance sensitivities and specificities, but their fusion for classifying clinically significant vs. clinically nonsignificant pain achieved substantial improvement, yielding 90.91% accuracy, with 100% sensitivity and 81.82% specificity. The multimodal measures capitalize on different features of the complex pain response. Thus, this paper presents both evidence for the utility of a weighted maximum likelihood algorithm as a novel feature selection method for EDA and video facial expression data and an accurate, objective automated classification algorithm capable of discriminating clinically significant from clinically nonsignificant pain in children.
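As an illustration only (not the authors' pipeline or data), the reported performance figures can be understood as standard confusion-matrix metrics for a binary classifier. The sketch below uses a hypothetical set of 22 labeled samples chosen so that the resulting accuracy, sensitivity, and specificity match the values quoted in the abstract:

```python
def binary_metrics(y_true, y_pred):
    """Return (accuracy, sensitivity, specificity) for binary labels,
    where 1 = clinically significant pain and 0 = nonsignificant pain."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    accuracy = (tp + tn) / len(y_true)
    sensitivity = tp / (tp + fn)   # true positive rate: pain correctly detected
    specificity = tn / (tn + fp)   # true negative rate: no-pain correctly detected
    return accuracy, sensitivity, specificity

# Hypothetical example: 11 significant-pain and 11 nonsignificant-pain samples;
# all positives detected, 2 negatives misclassified as positive.
y_true = [1] * 11 + [0] * 11
y_pred = [1] * 11 + [1, 1] + [0] * 9
acc, sens, spec = binary_metrics(y_true, y_pred)
print(f"accuracy={acc:.2%}, sensitivity={sens:.2%}, specificity={spec:.2%}")
# → accuracy=90.91%, sensitivity=100.00%, specificity=81.82%
```

With this split, 20 of 22 samples are correct (90.91% accuracy), every true pain case is caught (100% sensitivity), and 9 of 11 no-pain cases are correctly rejected (81.82% specificity), mirroring the reported fusion results.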