<?xml version="1.0" encoding="UTF-8"?>
<records>

  <record>
    <language>eng</language>
    <publisher>Oriental Scientific Publishing Company</publisher>
    <journalTitle>Biomedical and Pharmacology Journal</journalTitle>
    <issn>0974-6242</issn>
    <publicationDate>2025-02-20</publicationDate>
    <volume>18</volume>
    <issue>March Spl Edition</issue>
    <startPage>245</startPage>
    <endPage>255</endPage>
    <doi>10.13005/bpj/3085</doi>
    <publisherRecordId>64270</publisherRecordId>
    <documentType>article</documentType>
    <title language="eng">AI-Driven Multimodal Stress Detection: A Comparative Study</title>

    <authors>
      <author>
        <name>Sangita Ajit Patil</name>
        <affiliationId>1</affiliationId>
      </author>
      <author>
        <name>Ajay Namdeorao Paithane</name>
        <affiliationId>2</affiliationId>
      </author>
    </authors>
    
    <affiliationsList>
      <affiliationName affiliationId="1">Department of Electronics and Telecommunication, Faculty at Pimpri Chinchwad College of Engineering (PCCOE) and Research Scholar at JSPM's Rajarshi Shahu College of Engineering, Savitribai Phule Pune University (SPPU), Pune, India.</affiliationName>
      <affiliationName affiliationId="2">Department of Electronics and Telecommunication, Faculty at Dr. D. Y. Patil Institute of Engineering Management and Research (DYPIEMR), Savitribai Phule Pune University (SPPU), Pune, India.</affiliationName>
    </affiliationsList>
    <abstract language="eng">Stress affects mental and physical health, contributing to cardiovascular diseases and cognitive disorders, and early detection plays a crucial role in mitigating these risks. This study enhances stress detection by analyzing electroencephalography (EEG) signals from the DEAP (Database for Emotion Analysis using Physiological Signals) dataset and electrocardiogram (ECG) signals from the WESAD (Wearable Stress and Affect Detection) dataset, with EEG offering a cost-effective solution and ECG providing detailed cardiovascular insights. It compares individual sensor analysis with multi-sensor fusion, demonstrating that fusion improves accuracy: the ECG model achieves 91.79% accuracy, the EEG model reaches 96.6%, the feature-level fusion model achieves 98.6%, and the score-level fusion model achieves 97.8%. Using the Archimedes Optimization Algorithm (AOA) and Analytic Hierarchy Process (AHP) for feature selection and a hybrid Deep Convolutional Neural Network-Long Short-Term Memory (DCNN-LSTM) model for processing, the study highlights the effectiveness of a multimodal approach for real-time, accurate stress monitoring in clinical and industrial settings. Integrating additional modalities and refining these methods could further enhance the system, positioning AI-driven multimodal systems as powerful tools for early intervention and improved mental health management.</abstract>

    <fullTextUrl format="html">https://biomedpharmajournal.org/vol18marchspledition/ai-driven-multimodal-stress-detection-a-comparative-study/</fullTextUrl>

    <keywords language="eng">
      <keyword>Archimedes Optimization Algorithm (AOA)</keyword>
      <keyword>Deep Convolutional Neural Network (DCNN)</keyword>
      <keyword>Electrocardiogram (ECG)</keyword>
      <keyword>Electroencephalography (EEG)</keyword>
      <keyword>Long Short-Term Memory (LSTM)</keyword>
    </keywords>
  </record>
</records>