Understanding the role of emotion in decision making process: using machine learning to analyze physiological responses to visual, auditory, and combined stimulation.

Keywords: International Affective Digital Sounds (IADS); International Affective Pictures System (IAPS); biomedical signal processing; emotions; machine learning; physiological responses

Journal

Frontiers in Human Neuroscience
ISSN: 1662-5161
Abbreviated title: Front Hum Neurosci
Country: Switzerland
NLM ID: 101477954

Publication information

Publication date:
2023
History:
received: 2023-08-31
accepted: 2023-12-11
medline: 2024-01-23
pubmed: 2024-01-23
entrez: 2024-01-23
Status: epublish

Abstract

Emotions significantly shape decision-making, and targeted emotional elicitation is an important factor in neuromarketing, where it affects advertising effectiveness by capturing potential customers' attention through emotional triggers. Analyzing biometric parameters after stimulus exposure may help in understanding emotional states. This study investigates autonomic and central nervous system responses to emotional stimuli (images, auditory cues, and their combination) while recording physiological signals, namely the electrocardiogram, blood volume pulse, galvanic skin response, pupillometry, respiration, and the electroencephalogram. The primary goal of the proposed analysis is to compare emotional stimulation methods and to identify the approach that produces the most distinct physiological patterns. A novel feature selection technique is applied to further optimize the separation of four emotional states, and basic machine learning approaches are used to discern the emotions elicited by each kind of stimulation. Electroencephalographic, galvanic skin response, and cardio-respiratory coupling-derived features were the most significant in distinguishing the four emotional states. The findings further highlight the crucial role of auditory stimuli in creating distinct physiological patterns that enhance classification in a four-class problem. When all three types of stimulation were combined, a validation accuracy of 49% was achieved; the sound-only and image-only phases yielded 52% and 44% accuracy, respectively, whereas the combined stimulation of images and sounds led to 51% accuracy. Isolated visual stimuli yielded less distinct patterns, requiring more signals yet achieving inferior performance compared with the other types of stimuli.
The significance of auditory stimulation is surprising given its limited exploration in the emotion recognition literature, particularly in contrast with the plethora of studies using visual stimulation. In marketing, auditory components may therefore hold substantial potential to influence consumer choices.
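To illustrate the kind of four-class validation accuracy reported above, the following is a minimal sketch of classifying emotional states from per-trial physiological feature vectors. It is not the authors' pipeline: the data are synthetic, the feature count is arbitrary, and a simple nearest-centroid rule stands in for their feature selection and machine learning approaches.

```python
import numpy as np

# Hypothetical sketch of a 4-class emotion classification task from
# physiological features (e.g., EEG-, GSR-, and cardio-respiratory-derived).
# All data here are synthetic and purely illustrative.
rng = np.random.default_rng(0)
n_per_class, n_features = 40, 6

# Each emotional state gets a distinct (synthetic) mean feature vector.
means = rng.normal(0.0, 2.0, size=(4, n_features))
X = np.vstack([rng.normal(means[c], 1.0, size=(n_per_class, n_features))
               for c in range(4)])
y = np.repeat(np.arange(4), n_per_class)

# Shuffled 75/25 train/validation split.
idx = rng.permutation(len(y))
split = int(0.75 * len(y))
tr, va = idx[:split], idx[split:]

# Nearest-centroid classifier: assign each validation trial to the class
# whose training-set mean feature vector is closest in Euclidean distance.
centroids = np.array([X[tr][y[tr] == c].mean(axis=0) for c in range(4)])
dists = np.linalg.norm(X[va][:, None, :] - centroids[None, :, :], axis=2)
pred = dists.argmin(axis=1)
accuracy = (pred == y[va]).mean()
print(f"validation accuracy: {accuracy:.2f}")
```

On well-separated synthetic classes this toy classifier scores far above the 25% chance level; on real physiological data, overlapping class distributions explain why the reported accuracies sit in the 44-52% range.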

Identifiers

pubmed: 38259333
doi: 10.3389/fnhum.2023.1286621
pmc: PMC10800655

Publication types

Journal Article

Languages

eng

Pagination

1286621

Copyright information

Copyright © 2024 Polo, Farabbi, Mollura, Mainardi and Barbieri.

Conflict of interest statement

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Authors

Edoardo Maria Polo (EM)

SpinLabs, Dipartimento di Elettronica, Informazione e Bioingegneria, Politecnico di Milano, Milan, Italy.

Andrea Farabbi (A)

SpinLabs, Dipartimento di Elettronica, Informazione e Bioingegneria, Politecnico di Milano, Milan, Italy.

Maximiliano Mollura (M)

SpinLabs, Dipartimento di Elettronica, Informazione e Bioingegneria, Politecnico di Milano, Milan, Italy.

Luca Mainardi (L)

SpinLabs, Dipartimento di Elettronica, Informazione e Bioingegneria, Politecnico di Milano, Milan, Italy.

Riccardo Barbieri (R)

SpinLabs, Dipartimento di Elettronica, Informazione e Bioingegneria, Politecnico di Milano, Milan, Italy.

MeSH classifications