Please use this identifier to cite or link to this item: http://dspace.aiub.edu:8080/jspui/handle/123456789/2927
Title: Robust Multi-Weather Pothole Detection: An Enhanced YOLOv9 Trained on the MWPD Dataset
Authors: Parvin, Shahnaj
Munsy, Foysal
Rahat, Md Tanzeem
Nahar, Aminun
Nur, Kamruddin
Ghose, Debasish
Keywords: Pothole detection
Computer vision
Image processing
Deep learning
YOLO
Multi-weather road safety
Issue Date: 22-Oct-2025
Publisher: Elsevier
Abstract: Real-time pothole detection is crucial for advancing road safety and infrastructure management, particularly in challenging multi-weather conditions. Deep learning-based techniques, especially object detection models, have demonstrated higher accuracy than other approaches. This research proposes an improved YOLOv9 model, specifically designed for detecting road potholes in multi-weather conditions. To optimize performance, ADown layers were replaced with standard convolutional (Conv) layers at specific positions, enhancing feature extraction efficiency while reducing computational load. A custom dataset, the Multi-Weather Pothole Detection (MWPD) dataset, was developed, comprising roadway pothole images captured under varied environmental conditions. Data augmentation techniques, including color perturbation, contrast adjustment, Gaussian noise addition, flipping, and rotation, were applied to enhance training robustness. To ensure a reliable evaluation, a 5-fold cross-validation strategy was employed, partitioning the MWPD dataset into five equal subsets to minimize bias and variance. Under this evaluation protocol, the improved YOLOv9 achieved an average mAP@50 of 95% and an F1-score of 91%, outperforming the baseline YOLOv9 model on the MWPD dataset.
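The 5-fold cross-validation described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' code: the function name `five_fold_splits` and the fixed seed are assumptions, and the MWPD dataset itself is stood in for by a list of indices.

```python
import random

def five_fold_splits(items, k=5, seed=42):
    """Partition a dataset into k roughly equal folds and yield, for each
    fold, a (train, val) pair of index lists. Illustrative sketch of the
    5-fold cross-validation protocol described in the abstract; the seed
    and helper name are hypothetical, not from the paper."""
    idx = list(range(len(items)))
    random.Random(seed).shuffle(idx)          # shuffle once for unbiased folds
    folds = [idx[i::k] for i in range(k)]     # k interleaved, equal-size folds
    splits = []
    for i in range(k):
        val = folds[i]                        # fold i held out for validation
        train = [j for f in folds[:i] + folds[i + 1:] for j in f]
        splits.append((train, val))
    return splits

# Example: a 100-image dataset yields five (80 train, 20 val) partitions,
# so every image is used for validation exactly once across the five runs.
splits = five_fold_splits(range(100))
```

Averaging mAP@50 and F1 over the five held-out folds, as the abstract reports, reduces the variance a single train/test split would introduce.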
URI: http://dspace.aiub.edu:8080/jspui/handle/123456789/2927
ISSN: 2590-1230
Appears in Collections: Publications: Journals

Files in This Item:
File: An enhanced YOLOv9 trained on the MWPD dataset_2025.docx
Size: 3.83 MB
Format: Microsoft Word XML


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.