Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/140163
Title: iReceipt : an intelligent expenses tracker, based on receipt analysis and machine learning (A)
Authors: Young, Ying Jie
Keywords: Engineering::Electrical and electronic engineering
Issue Date: 2020
Publisher: Nanyang Technological University
Project: A3004-191
Abstract: Tracking expenses has always been a daunting and tedious task. Many people do not track their expenses simply because recording every purchase is not worth the effort just to know how much they spend. However, research has shown that tracking one’s expenses can make an individual more aware of and careful about future purchases, as well as improve memory and thinking. In this Final Year Project (FYP), we propose a mobile application that uses Optical Character Recognition (OCR) to simplify the task of tracking expenses. A user simply uploads an image of a spending receipt, and the necessary information, such as the Merchant Name, Category of Spending and Spending Amount, is stored for tracking purposes. This way, users do not have to manually key in the amount of every purchase; they can simply photograph their receipts. In addition, users can view their receipt images, together with visual representations of their spending as a bar graph and a pie chart, to better understand how much they should spend and what they should spend less on.
URI: https://hdl.handle.net/10356/140163
Fulltext Permission: restricted
Fulltext Availability: With Fulltext
Appears in Collections:EEE Student Reports (FYP/IA/PA/PI)
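The report's actual implementation is behind restricted access, but the field-extraction step the abstract describes (pulling the Merchant Name and Spending Amount out of a scanned receipt) can be illustrated with a minimal, stdlib-only sketch. It assumes the OCR stage has already produced plain text, and uses hypothetical heuristics (merchant on the first non-empty line, total on the last line containing "TOTAL") that are illustrative only and not the author's method:

```python
import re

def parse_receipt_text(ocr_text: str) -> dict:
    """Extract a merchant name and total from OCR'd receipt text.

    Illustrative heuristics only: the merchant is assumed to be the
    first non-empty line, and the total is taken from the last line
    containing the word 'TOTAL'.
    """
    lines = [ln.strip() for ln in ocr_text.splitlines() if ln.strip()]
    merchant = lines[0] if lines else ""
    total = None
    for ln in lines:
        if "TOTAL" in ln.upper():
            # Grab a decimal amount such as 7.50 from the total line.
            m = re.search(r"(\d+\.\d{2})", ln)
            if m:
                total = float(m.group(1))
    return {"merchant": merchant, "total": total}

# Hypothetical OCR output for a small receipt.
sample = """ACME MART
2 x Milk      5.40
Bread         2.10
TOTAL         7.50
"""
print(parse_receipt_text(sample))  # {'merchant': 'ACME MART', 'total': 7.5}
```

A production pipeline of the kind the abstract proposes would sit this parsing step behind an OCR engine and add category inference, but the basic shape — OCR text in, structured expense record out — is the same.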

Files in This Item:
File: FYP Final Report (Ying Jie).pdf
Description: Restricted Access
Size: 6.99 MB
Format: Adobe PDF


Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.