<?xml version="1.0" encoding="UTF-8"?>
<records>
  <record>
    <language>eng</language>
    <publisher>Oriental Scientific Publishing Company</publisher>
    <journalTitle>Biomedical and Pharmacology Journal</journalTitle>
    <issn>0974-6242</issn>
    <publicationDate>2022-03-31</publicationDate>
    <volume>15</volume>
    <issue>1</issue>
    <startPage>277</startPage>
    <endPage>284</endPage>
    <doi>10.13005/bpj/2364</doi>
    <publisherRecordId>42803</publisherRecordId>
    <documentType>article</documentType>
    <title language="eng">Classification of Cervical Cytology Overlapping Cell Images with Transfer Learning Architectures</title>

    <authors>
      <author>
        <name>Pallavi V. Mulmule</name>
        <affiliationId>1</affiliationId>
      </author>
      <author>
        <name>Rajendra D. Kanphade</name>
        <affiliationId>1</affiliationId>
      </author>
    </authors>
    <affiliationsList>
      <affiliationName affiliationId="1">Department of E and TC, D. Y. Patil Institute of Technology, Pimpri, Pune, India</affiliationName>
      <affiliationName affiliationId="2">JSPM’s Jayawantrao Sawant College of Engineering, Hadapsar, Pune, India</affiliationName>
    </affiliationsList>






    <abstract language="eng">Cervical cell classification is a clinical biomarker for early-stage cervical cancer screening. An accurate and early diagnosis plays a vital role in preventing cervical cancer. Recently, transfer learning with deep convolutional neural networks has been deployed in many biomedical applications. The proposed work applies cutting-edge pre-trained networks, AlexNet, ImageNet and Places365, to cervix images to detect cancer. These pre-trained networks are fine-tuned and retrained on augmented cervical cancer data from the publicly available benchmark CERVIX93 dataset. The models were evaluated on performance measures, viz. accuracy, precision, sensitivity, specificity, F-score, MCC and kappa score. The results show that the AlexNet model performs best for cervical cancer prediction, with 99.03% accuracy and a kappa coefficient of 0.98, indicating almost perfect agreement. Finally, this significant success rate makes the AlexNet model a useful assistive tool for radiologists and clinicians in detecting cervical cancer from Pap-smear cytology images.</abstract>

    <fullTextUrl format="html">https://biomedpharmajournal.org/vol15no1/classification-of-cervical-cytology-overlapping-cell-images-with-transfer-learning-architectures/</fullTextUrl>

    <keywords language="eng">
      <keyword>AlexNet</keyword>
      <keyword>Cervical cancer</keyword>
      <keyword>Cytology Images</keyword>
      <keyword>Convolution Neural Network Models</keyword>
      <keyword>Deep learning architectures</keyword>
      <keyword>ImageNet</keyword>
    </keywords>
  </record>
</records>