Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/66550
Title: Developing a game using Unity 3D and Leap Motion controller
Authors: Muhammad Salihan Zaol-Kefli
Keywords: DRNTU::Engineering::Computer science and engineering
Issue Date: 2016
Abstract: Games on personal computers have traditionally been played with the keyboard. However, given the popularity of mobile games and of Microsoft’s Kinect, using gestures to play games on personal computers may improve the user experience. Hence, the objective of this Final Year Project is to develop a game with the Unity3D game engine, using the Leap Motion controller to capture hand gestures and let players interact with the elements in the game. This report also gives an overview of the project schedule, the work breakdown structure and the level designs, and elaborates on how the different components are put together for the game. After the implementation was completed, a user study was conducted in which ten people were asked to play and test the game and provide feedback. From the data gathered, it was concluded that using gestures to play the game did give a higher degree of interaction and immersion compared with using the keyboard alone. That said, there were some complaints about certain gestures, which shows that the right gestures are needed to provide a truly positive user experience: the gestures chosen should be simple, easy to learn and use, feel natural, and not cause discomfort. In addition, most of the participants still preferred playing games with the keyboard because they are used to it, which implies that it will take time for gesture-based play on personal computers to catch on. Finally, this project can serve as a foundation for further exploration of user experience in games. In February 2016, Leap Motion revealed a new project called Orion, which promises greater accuracy in gesture capture, and hardware is being developed for mounting on Virtual Reality headsets such as the Oculus Rift. This project has already set up the first-person perspective and some gestures to control movement, so any continuation of this project can reuse the components that have been implemented.
URI: http://hdl.handle.net/10356/66550
Rights: Nanyang Technological University
Fulltext Permission: restricted
Fulltext Availability: With Fulltext
Appears in Collections: SCSE Student Reports (FYP/IA/PA/PI)

Files in This Item:
File: FYP Final Report (Muhammad Salihan Zaol-kefli, U1221712J).pdf (Restricted Access, 2.21 MB, Adobe PDF)

Page view(s): 288 (updated on Nov 25, 2020)
Download(s): 50 (updated on Nov 25, 2020)

Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.