Title: Vehicle speed prediction with LSTM from trajectories
Authors: Srikanth, Samhita Kadayam
Keywords: Engineering::Computer science and engineering::Computing methodologies::Artificial intelligence
Issue Date: 2022
Publisher: Nanyang Technological University
Source: Srikanth, S. K. (2022). Vehicle speed prediction with LSTM from trajectories. Final Year Project (FYP), Nanyang Technological University, Singapore.
Abstract: This project was undertaken with the objective of predicting a vehicle's velocity from trajectory data using an LSTM. The average velocity of a vehicle was calculated from the distance covered over time, and this velocity was interpolated into a continuous form so that continuous data could be predicted. The project uses a neural-network-based model that takes in continuous data on a vehicle's velocity and outputs its next velocity at every 10-second interval. This was implemented using TensorFlow, Python, and a stacked LSTM (Long Short-Term Memory) model. The data came from a navigation-system dataset spanning July 3-9, 2017, which contains five workdays (Jul 03-07) and two weekend days (Jul 08-09). Pre-processing involved cleaning the data and restructuring it into a continuous format with 1-second intervals. Following this, the data was split into a 9-input, 1-output format, clustered using the K-Means method, and then used to train and test the model. This paper focuses on the design and implementation of the data pipeline and the model, as well as the challenges encountered during this process.
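The 9-input, 1-output split described in the abstract can be illustrated with a short sliding-window sketch. This is a minimal illustration only: the function name and the synthetic velocity series below are placeholders of my own, not taken from the project or its navigation dataset.

```python
import numpy as np

def make_windows(series, n_in=9, n_out=1):
    """Slide over a 1 Hz velocity series, collecting n_in consecutive
    samples as inputs and the following sample as the target."""
    X, y = [], []
    for i in range(len(series) - n_in - n_out + 1):
        X.append(series[i : i + n_in])       # 9 past velocity values
        y.append(series[i + n_in])           # the next velocity value
    return np.array(X), np.array(y)

# Synthetic 1-second velocity samples (stand-in for the real trajectory data).
velocity = np.linspace(10.0, 15.0, num=60)
X, y = make_windows(velocity)
print(X.shape, y.shape)  # (51, 9) (51,)
```

Each row of `X` covers a 10-second span of the series (9 inputs plus the 1 predicted value), matching the interval the abstract mentions; in the project these windows would then be clustered with K-Means and fed to the stacked LSTM.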
Schools: School of Electrical and Electronic Engineering 
Fulltext Permission: restricted
Fulltext Availability: With Fulltext
Appears in Collections:EEE Student Reports (FYP/IA/PA/PI)

Files in This Item:
File: Restricted Access (1.26 MB, Adobe PDF)

Page view(s)

Updated on Dec 10, 2023

Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.