Please use this identifier to cite or link to this item: https://elib.vku.udn.vn/handle/123456789/2742
Full metadata record
dc.contributor.author: Nguyen, Duy Linh
dc.contributor.author: Vo, Xuan Thuy
dc.contributor.author: Adri, Priadana
dc.contributor.author: Kang-Hyun, Jo
dc.date.accessioned: 2023-09-26T02:15:56Z
dc.date.available: 2023-09-26T02:15:56Z
dc.date.issued: 2023-07
dc.identifier.isbn: 978-3-031-36886-8
dc.identifier.uri: https://link.springer.com/chapter/10.1007/978-3-031-36886-8_9
dc.identifier.uri: http://elib.vku.udn.vn/handle/123456789/2742
dc.description: Lecture Notes in Networks and Systems (LNNS, volume 734); CITA: Conference on Information Technology and its Applications; pp. 102-113.
dc.description.abstract: Nowadays, YOLOv5 is one of the most widely used object detection network architectures in real-time systems for traffic management and regulation. To develop a parking management tool, this paper proposes a car detection network based on a redesign of the YOLOv5 network architecture. This research focuses on optimizing the number of network parameters using lightweight modules from the EfficientNet and PP-LCNet architectures. The proposed network is trained and evaluated on two benchmark datasets, the Car Parking Lot Dataset and the Pontifical Catholic University of Parana+ Dataset, and results are reported on the mAP@0.5 and mAP@0.5:0.95 metrics. This network achieves its best performance at 95.8% and 97.4% mAP@0.5 on the Car Parking Lot Dataset and the Pontifical Catholic University of Parana+ Dataset, respectively.
dc.language.iso: en
dc.publisher: Springer Nature
dc.subject: Convolutional neural network (CNN)
dc.subject: EfficientNet
dc.subject: PP-LCNet
dc.subject: Parking management
dc.subject: YOLOv5
dc.title: Car Detector Based on YOLOv5 for Parking Management
dc.type: Working Paper
Appears in Collections: CITA 2023 (International)

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.