Please use this identifier to cite or link to this item:
http://dspace.aiub.edu:8080/jspui/handle/123456789/1431
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.date.accessioned | 2023-10-09T09:40:06Z | - |
dc.date.available | 2023-10-09T09:40:06Z | - |
dc.date.issued | 2022-12 | - |
dc.identifier.other | http://doi.org/10.1016/j.mex.2022.101936 | - |
dc.identifier.uri | http://dspace.aiub.edu:8080/jspui/handle/123456789/1431 | - |
dc.description.abstract | Human detection is an important task in computer vision and a key component of global security and safety monitoring. Deep learning has recently advanced human detection technology, yet few techniques produce networks that combine a small size, a deep architecture, and a fast training time while maintaining accuracy. ReSTiNet is a novel compact convolutional neural network that addresses network size, detection speed, and accuracy together. ReSTiNet incorporates fire modules, whose number and position in the network are evaluated to minimize the model parameters and network size. To improve detection speed and accuracy, the residual block within each fire module is carefully designed to increase feature propagation and maximize information flow through the network. The developed approach compresses the well-known Tiny-YOLO architecture while delivering (i) a smaller model size, (ii) faster detection, (iii) reduced overfitting, and (iv) higher mAP on the Pascal VOC and MS COCO datasets than other compact networks such as SqueezeNet and MobileNet. | en_US |
dc.title | ReSTiNet: An Efficient Deep Learning Approach to Improve Human Detection Accuracy | en_US |
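The abstract's central design idea is a fire module (a SqueezeNet-style squeeze/expand block) with an internal residual connection. Below is a minimal sketch of such a block in PyTorch, assuming the standard squeeze/expand layout; the channel counts, activation placement, and projection shortcut are illustrative assumptions, not the paper's exact configuration.

```python
import torch
import torch.nn as nn

class ResidualFireModule(nn.Module):
    """Fire module (squeeze + expand, as in SqueezeNet) with a residual
    skip connection, sketching the design the abstract describes."""

    def __init__(self, in_channels, squeeze_channels, expand_channels):
        super().__init__()
        # 1x1 squeeze layer reduces the channel count to cut parameters.
        self.squeeze = nn.Conv2d(in_channels, squeeze_channels, kernel_size=1)
        # Parallel 1x1 and 3x3 expand layers; their outputs are concatenated.
        self.expand1x1 = nn.Conv2d(squeeze_channels, expand_channels, kernel_size=1)
        self.expand3x3 = nn.Conv2d(squeeze_channels, expand_channels,
                                   kernel_size=3, padding=1)
        self.relu = nn.ReLU(inplace=True)
        # An identity skip is only valid when channel counts match; otherwise
        # a 1x1 projection aligns the shortcut (an assumption made here).
        out_channels = 2 * expand_channels
        self.shortcut = (nn.Identity() if in_channels == out_channels
                         else nn.Conv2d(in_channels, out_channels, kernel_size=1))

    def forward(self, x):
        s = self.relu(self.squeeze(x))
        expanded = torch.cat([self.relu(self.expand1x1(s)),
                              self.relu(self.expand3x3(s))], dim=1)
        # Residual addition lets features bypass the squeeze bottleneck.
        return expanded + self.shortcut(x)

# Example: a 128-channel feature map passes through with its shape preserved.
x = torch.randn(1, 128, 52, 52)
block = ResidualFireModule(128, squeeze_channels=16, expand_channels=64)
print(block(x).shape)  # torch.Size([1, 128, 52, 52])
```

The residual addition is what distinguishes this from a plain SqueezeNet fire module: the shortcut lets gradients and features flow around the squeeze bottleneck, which is the mechanism the abstract credits for increased feature propagation and information flow.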
Appears in Collections: Publications: Journals
Files in This Item:
File | Description | Size | Format |
---|---|---|---|
journal (6).docx | | 6.28 MB | Microsoft Word XML |