Please use this identifier to cite or link to this item:
http://dspace.aiub.edu:8080/jspui/handle/123456789/2856
Title: | LungCANet: A Novel Deep Co-attention Convolutional Neural Network Architecture for High-Precision Lung Cancer Morphological Analysis and Classification |
Authors: | Ahmmed, Md. Mortuza; Babu, Ashraful; Rahman, M. Mostafizur; Mahmud, Mufti; Ahammad, Mejbah |
Keywords: | Lung Cancer, Neural Network |
Issue Date: | 24-Jun-2025 |
Publisher: | Springer Nature |
Abstract: | In the realm of lung cancer diagnostics, traditional imaging and classification methodologies exhibit notable limitations, primarily due to their inability to effectively process and analyze the intricate morphological variations of lung cancer in medical imaging data. To address this issue, this study introduces LungCANet, an innovative deep learning framework tailored for the precise diagnosis and classification of lung cancer. Utilizing mechanisms such as Squeeze-and-Excitation (SE) blocks and residual connections, LungCANet significantly enhances diagnostic accuracy by effectively discerning critical features within complex lung imaging data. Through a comprehensive experimental analysis, this research validates LungCANet's superior performance against conventional diagnostic methods, demonstrating its potential to transform early cancer detection and treatment strategies. The efficacy of LungCANet was rigorously evaluated on comprehensive datasets, achieving an average accuracy of 97.33% on the training set and significant performance gains on validation datasets, with an accuracy of 98.61%. These results underscore LungCANet's potential to significantly advance the early detection and classification of lung cancer, setting a new benchmark for diagnostic performance with its state-of-the-art architecture. |
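The abstract names two architectural mechanisms: Squeeze-and-Excitation (SE) blocks and residual connections. As a minimal sketch of how these generally combine (the actual LungCANet layer sizes and wiring are not given in this record; all shapes, names, and the reduction choice below are illustrative assumptions, shown here in plain NumPy):

```python
import numpy as np

def se_block_with_residual(x, w1, b1, w2, b2):
    """Sketch of a Squeeze-and-Excitation block with a residual connection.

    x: feature map of shape (C, H, W).
    w1, b1: first excitation FC layer, mapping C -> C//r (bottleneck).
    w2, b2: second excitation FC layer, mapping C//r -> C.
    All parameter names and shapes are illustrative, not taken from the paper.
    """
    # Squeeze: global average pooling over the spatial dimensions -> (C,)
    s = x.mean(axis=(1, 2))
    # Excitation: bottleneck FC -> ReLU -> FC -> sigmoid, giving per-channel gates in (0, 1)
    z = np.maximum(0.0, w1 @ s + b1)
    gate = 1.0 / (1.0 + np.exp(-(w2 @ z + b2)))
    # Scale each channel by its gate, then add the identity (residual) path
    return x * gate[:, None, None] + x
```

The gate lets the network reweight channels by global context before the residual addition preserves the original signal path, which is the general motivation for pairing SE blocks with skip connections.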
URI: | http://dspace.aiub.edu:8080/jspui/handle/123456789/2856 |
ISBN: | 978-981-96-6588-4 |
Appears in Collections: | Publication: Conference |