Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/139717
Title: Implementing of sensors and actuators for autonomous driving
Authors: Yeo, Darren Zhi Duo
Keywords: Engineering::Electrical and electronic engineering
Issue Date: 2020
Publisher: Nanyang Technological University
Project: B2062-191
Abstract: Autonomous vehicles, also known as driverless cars, are an emerging technology. They exploit the functions of different sensors to enable the vehicle to ‘see’ and ‘hear’. According to SAE International [1], there are six levels of driving automation, ranging from level zero to level five: levels zero, one and two mandate the driver’s control, while level three and above provide autonomous functions to the vehicle, with monitoring by the driver. This project implements level three driving automation on a battery-electric vehicle that will participate in the Shell Eco-marathon Asia 2020 Autonomous Category, by identifying appropriate sensors and installing them on the vehicle. Level three automation means the vehicle can operate without the driver’s input even though he or she remains in the driver’s seat; however, the driver will be required to take over control when needed. With level three automation, the vehicle will be able to compete in the different stages of the competition. Sensors such as Light Detection and Ranging (LiDAR), Radio Detection and Ranging (radar), the Global Positioning System (GPS), stereo cameras and ultrasonic sensors are widely used in autonomous vehicles. This report presents a thorough study and comparison of the sensors available on the market and describes the sensors selected for use on NV11, along with details of the sensor implementation and the electrical circuitry used in the vehicle. Lastly, the LiDAR is tested by creating a 2D map using the gmapping SLAM process, and the ultrasonic sensors are tested for detecting distances in close proximity.
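As background to the close-proximity distance test mentioned in the abstract, ultrasonic rangefinders typically work by timing an echo's round trip and halving the distance travelled at the speed of sound. The sketch below illustrates that time-of-flight calculation only; the function name and the default speed-of-sound value (343 m/s in air at roughly 20 °C) are illustrative assumptions, not taken from the report itself.

```python
def ultrasonic_distance_m(echo_time_s: float, speed_of_sound: float = 343.0) -> float:
    """Convert an ultrasonic echo round-trip time (seconds) to a one-way
    distance in metres.

    The pulse travels to the obstacle and back, so the measured time covers
    twice the distance; dividing by two gives the range to the obstacle.
    """
    if echo_time_s < 0:
        raise ValueError("echo time cannot be negative")
    return echo_time_s * speed_of_sound / 2.0


# Example: a 10 ms round trip corresponds to about 1.7 m
print(ultrasonic_distance_m(0.010))  # 1.715
```

In practice the echo time would come from a sensor's echo pin or serial output rather than a hard-coded value, and results at very close range are limited by the sensor's minimum detection distance.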
URI: https://hdl.handle.net/10356/139717
Fulltext Permission: restricted
Fulltext Availability: With Fulltext
Appears in Collections:EEE Student Reports (FYP/IA/PA/PI)

Files in This Item:
File: U1720127J_YEOZHIDUODARREN FINAL REPORT.pdf
Description: Restricted Access
Size: 3.28 MB
Format: Adobe PDF


Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.