Deep Neural Network for Visual Localization of Autonomous Car in ITS Campus Environment
DOI: https://doi.org/10.12962/jaree.v7i2.365
Abstract
Intelligent Car (I-Car) ITS is an autonomous car prototype whose primary localization method is GPS. However, the accuracy of GPS readings depends on the availability of information from GPS satellites, which in turn depends on local conditions such as weather and atmospheric effects, signal blockage, and the density of the surrounding terrain. In this paper we propose a solution to the unavailability of GPS localization information, based on visual data from an omnidirectional camera, that recognizes the environment around the ITS campus using a Deep Neural Network. During data collection, GPS coordinates are recorded as output reference points while the omnidirectional camera captures images of the surrounding environment. Visual localization trials were carried out in the ITS environment with a total of 200 GPS coordinates, where each GPS coordinate represents one class, yielding 200 classes for classification. Each coordinate/class has 96 training images; this was achieved at a vehicle speed of 20 km/h with an image acquisition rate of 30 fps from the omnidirectional camera. Using the AlexNet architecture, the resulting visual localization accuracy is 49-54%. The test results were obtained with a learning rate of 0.00001, data augmentation, and the Dropout technique to prevent overfitting and improve accuracy stability.
References
ITS, “i-Car, Kado Spesial ITS untuk Indonesia” [“i-Car, ITS's Special Gift for Indonesia”], https://www.its.ac.id/news/2020/08/17/i-car-kado-spesial-its-untuk-indonesia/ (accessed Oct. 21, 2021).
Kompas, “ITS Luncurkan iCar, Mobil Listrik Otonom Tanpa Pengemudi” [“ITS Launches iCar, a Driverless Autonomous Electric Car”], https://www.kompas.com/edu/read/2020/08/23/183000471/its-luncurkan-icar-mobil-listrik-otonom-tanpa-pengemudi?page=all (accessed Oct. 21, 2021).
D. Scaramuzza, “Omnidirectional Camera,” 2014, doi: 10.5167/UZH-106115.
H.-Y. Lin and C.-H. He, “Mobile Robot Self-Localization Using Omnidirectional Vision with Feature Matching from Real and Virtual Spaces,” Appl. Sci., vol. 11, no. 8, p. 3360, Apr. 2021, doi: 10.3390/app11083360.
J. A. Larcom and H. Liu, “Modeling and characterization of GPS spoofing”, 2013 IEEE International Conference on Technologies for Homeland Security (HST), pp. 729-734, 2013.
A. Mulla, J. Baviskar, A. Baviskar and A. Bhovad, “GPS assisted Standard Positioning Service for navigation and tracking: Review & implementation”, 2015 International Conference on Pervasive Computing (ICPC), pp. 1-6, 2015.
T. Bai, B. Mei, L. Zhao and X. Wang, “Machine Learning-Assisted Wireless Power Transfer Based on Magnetic Resonance”, IEEE Access, vol. 7, pp. 109454-109459, 2019.
J. Xin, X. Li, Y. Zhang, L. Zhang, J. Wei and S. Huang, “DNN-based Multi-Faults Localization for 5G Coexisting Radio and Optical Wireless Networks”, 2021 17th International Conference on the Design of Reliable Communication Networks (DRCN), pp. 1-6, 2021.
J. Wei, “AlexNet: The Architecture that Challenged CNNs”, Medium, Sep. 25, 2020. [Online]. Available: https://towardsdatascience.com/alexnet-the-architecture-that-challenged-cnns-e406d5297951
A. Krizhevsky, I. Sutskever and G. E. Hinton, “ImageNet Classification with Deep Convolutional Neural Networks”, Advances in Neural Information Processing Systems, vol. 25, 2012. [Online]. Available: https://proceedings.neurips.cc/paper/2012/hash/c399862d3b9d6b76c8436e924a68c45b-Abstract.html
V. Feng, “An Overview of ResNet and its Variants”, Towards Data Science. [Online]. Available: https://towardsdatascience.com/an-overview-of-resnet-and-its-variants-5281e2f56035 (accessed Jun. 11, 2022).
“Residual Networks (ResNet) - Deep Learning”, GeeksforGeeks. [Online]. Available: https://www.geeksforgeeks.org/residual-networks-resnet-deep-learning/ (accessed Jun. 11, 2022).
Copyright
Submission of a manuscript implies that the submitted work has not been published before (except as part of a thesis, report, or abstract); that it is not under consideration for publication elsewhere; and that its publication has been approved by all co-authors. If and when the manuscript is accepted for publication, the author(s) still hold the copyright and retain publishing rights without restriction. Authors and others are allowed to reproduce the article as long as it is not for commercial purposes. For new inventions, authors are advised to secure a patent before publication. The license type is CC-BY-NC 4.0.
Disclaimer
No responsibility is assumed by the publisher, the co-publishers, or the editors for any injury and/or damage to persons or property resulting from any actual or alleged libelous statements, infringement of intellectual property or privacy rights, or products liability, whether arising from negligence or otherwise, or from any use or operation of any ideas, instructions, procedures, products, or methods contained in the material herein.


