Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/184196
Title: Controllable human motion generation
Authors: Yeo, Jia Ying
Keywords: Computer and Information Science
Issue Date: 2025
Publisher: Nanyang Technological University
Source: Yeo, J. Y. (2025). Controllable human motion generation. Final Year Project (FYP), Nanyang Technological University, Singapore. https://hdl.handle.net/10356/184196
Abstract: Controllability is crucial in text-to-motion generation because it allows users to specify precise motion details, such as the trajectory of a limb or the orientation of a body part. This level of control is vital in fields like gaming, where unique character movements enhance the user experience, and in animation, where motions can be manipulated to suit narrative or aesthetic needs. However, existing models often struggle to balance semantic fidelity, control, and efficiency: methods that offer fine-grained control over joint movements or body parts may require extensive computation, limiting their practicality for real-time applications. This project examines the impact of various samplers on improving inference speed for OmniControl, a controllable motion generation approach (an illustrative sketch of few-step sampling follows the metadata below). Additionally, a graphical user interface has been developed as a functional prototype.
URI: https://hdl.handle.net/10356/184196
Schools: College of Computing and Data Science 
Fulltext Permission: restricted
Fulltext Availability: With Fulltext
Appears in Collections: CCDS Student Reports (FYP/IA/PA/PI)
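As context for the abstract's point about samplers: speeding up inference for a diffusion-based motion model typically means replacing the full denoising schedule with a few-step sampler such as DDIM. The sketch below is a minimal, generic illustration of that idea, not code from the report or from OmniControl; the denoiser, noise schedule, and tensor shape are assumptions (the (frames, 263) layout follows the HumanML3D feature format commonly used by such models).

    import torch

    def ddim_sample(denoiser, shape, n_train_steps=1000, n_infer_steps=50):
        # Deterministic DDIM (eta = 0): visit only a subset of the
        # timesteps the model was trained on, e.g. 1000 -> 50 steps.
        betas = torch.linspace(1e-4, 0.02, n_train_steps)   # linear noise schedule
        alpha_bar = torch.cumprod(1.0 - betas, dim=0)       # cumulative alphas
        timesteps = torch.linspace(n_train_steps - 1, 0, n_infer_steps).long()

        x = torch.randn(shape)                              # start from pure noise
        for i, t in enumerate(timesteps):
            eps = denoiser(x, t)                            # predicted noise
            a_t = alpha_bar[t]
            a_prev = alpha_bar[timesteps[i + 1]] if i + 1 < len(timesteps) else torch.tensor(1.0)
            x0 = (x - (1 - a_t).sqrt() * eps) / a_t.sqrt()  # predicted clean motion
            x = a_prev.sqrt() * x0 + (1 - a_prev).sqrt() * eps  # deterministic jump
        return x

    # Toy stand-in; a real denoiser would condition on text and spatial controls.
    toy_denoiser = lambda x, t: torch.zeros_like(x)
    motion = ddim_sample(toy_denoiser, shape=(1, 196, 263))

With eta = 0 the update is deterministic, so reducing n_infer_steps trades a small amount of sample quality for a roughly proportional reduction in denoiser calls, which is the cost that dominates inference time.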

Files in This Item:
File: SCSE23-0314_Amended_Final_Report_v2.pdf (Restricted Access)
Size: 1.17 MB
Format: Adobe PDF

Page view(s): 35 (updated on May 7, 2025)
Download(s): 2 (updated on May 7, 2025)

Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.