Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/176679
Full metadata record
DC Field: Value
dc.contributor.author: Aguilon, Deryl Trinidad
dc.date.accessioned: 2024-05-20T02:37:22Z
dc.date.available: 2024-05-20T02:37:22Z
dc.date.issued: 2024
dc.identifier.citation: Aguilon, D. T. (2024). 3D mapping with depth-sensing camera. Final Year Project (FYP), Nanyang Technological University, Singapore. https://hdl.handle.net/10356/176679
dc.identifier.uri: https://hdl.handle.net/10356/176679
dc.description.abstract: Depth-sensing cameras have recently been used in many applications, most commonly in automation. The ability to observe the world in three dimensions allows robots to better interact with, and make decisions about, the environment around them. This project demonstrates how to develop a program that interfaces with a commercial depth-sensing camera and processes the generated point cloud to approximate the normal angle of an observed object's surface. This is accomplished with libraries such as the Software Development Kit (SDK) 2.0 and the Point Cloud Library (PCL); an illustrative PCL sketch of the normal-estimation step is shown after this record.
dc.language.iso: en
dc.publisher: Nanyang Technological University
dc.relation: A2011-231
dc.subject: Engineering
dc.title: 3D mapping with depth-sensing camera
dc.type: Final Year Project (FYP)
dc.contributor.supervisor: Cuong Dang
dc.contributor.school: School of Electrical and Electronic Engineering
dc.description.degree: Bachelor's degree
dc.contributor.supervisoremail: HCDang@ntu.edu.sg
dc.subject.keywords: 3D mapping
dc.subject.keywords: Depth-sensing camera
dc.subject.keywords: Point-cloud-library
item.grantfulltext: restricted
item.fulltext: With Fulltext
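
The full text is restricted, so the abstract's description of "approximating the normal angle of the observed object's surface" is the only detail available here. The following is a minimal sketch of how that step is commonly done with the Point Cloud Library named in the abstract, using PCL's NormalEstimation over a local neighbourhood; the input file name "scene.pcd", the 3 cm search radius, and the angle-to-optical-axis computation are illustrative assumptions, and the depth-camera capture via its SDK is omitted.

#include <cmath>
#include <pcl/point_types.h>
#include <pcl/point_cloud.h>
#include <pcl/io/pcd_io.h>
#include <pcl/features/normal_estimation.h>
#include <pcl/search/kdtree.h>

int main()
{
    // Load a point cloud previously captured from the depth camera.
    // "scene.pcd" is a placeholder file name for illustration.
    pcl::PointCloud<pcl::PointXYZ>::Ptr cloud(new pcl::PointCloud<pcl::PointXYZ>);
    if (pcl::io::loadPCDFile<pcl::PointXYZ>("scene.pcd", *cloud) < 0)
        return 1;

    // Estimate a surface normal at each point from its local neighbourhood.
    pcl::NormalEstimation<pcl::PointXYZ, pcl::Normal> ne;
    ne.setInputCloud(cloud);

    pcl::search::KdTree<pcl::PointXYZ>::Ptr tree(new pcl::search::KdTree<pcl::PointXYZ>());
    ne.setSearchMethod(tree);
    ne.setRadiusSearch(0.03);  // 3 cm neighbourhood; tune to the sensor's resolution

    pcl::PointCloud<pcl::Normal>::Ptr normals(new pcl::PointCloud<pcl::Normal>);
    ne.compute(*normals);

    // Each normal is a unit vector; its angle to the camera's optical (z) axis
    // approximates the orientation of the observed surface at that point.
    for (std::size_t i = 0; i < normals->size(); ++i)
    {
        const pcl::Normal& n = normals->points[i];
        float angle_to_z = std::acos(std::fabs(n.normal_z));  // radians
        (void)angle_to_z;  // use or store as needed
    }
    return 0;
}

The radius-based neighbourhood search is one reasonable choice; a fixed k-nearest-neighbour search (setKSearch) is a common alternative when point density varies strongly with distance from the camera.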
Appears in Collections:EEE Student Reports (FYP/IA/PA/PI)
Files in This Item:
deryl fyp.pdf (Restricted Access, 2.62 MB, Adobe PDF)

Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.