Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/138044
Title: Deep learning for snake pattern detection
Authors: Ching, Jia Chin
Keywords: Engineering::Computer science and engineering::Computing methodologies::Pattern recognition
Issue Date: 2020
Publisher: Nanyang Technological University
Project: SCSE19-0170
Abstract: Snakebites are a serious concern in many countries worldwide, especially in rural, less developed regions. About 100,000 people die from snakebites every year in these countries, and three times as many suffer lasting effects such as amputation and kidney failure. Our project, SnakeAlert, aims to reduce snakebites and raise public awareness. This year, we focus on improving snakebite response times through early snake recognition: image recognition is used to quickly identify venomous snakes and direct victims to the nearest hospital stocking the required antivenom. We used neural networks and machine learning techniques to train a model to identify venomous snakes and achieved a 60% success rate at identifying them. This is a relatively high success rate and indicates that image recognition technology can be applied to life-saving snake recognition procedures. Furthermore, the technique is not yet optimised and could be improved with a better dataset and neural network model. (An illustrative sketch of such a classifier follows the record fields below.)
URI: https://hdl.handle.net/10356/138044
Fulltext Permission: restricted
Fulltext Availability: With Fulltext
Appears in Collections:SCSE Student Reports (FYP/IA/PA/PI)
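
Illustrative sketch: the full report is under restricted access, so the actual model architecture, dataset, and training setup are not described in this record. The code below is a minimal, hypothetical example of the kind of venomous/non-venomous image classifier the abstract describes, using transfer learning with a pretrained ResNet-18 in PyTorch. The data paths, class layout, and hyperparameters are assumptions for illustration only, not the project's actual configuration.

import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

# Hypothetical dataset layout (assumption, not the project's actual data):
#   data/train/venomous/*.jpg
#   data/train/non_venomous/*.jpg
transform = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],  # ImageNet statistics
                         std=[0.229, 0.224, 0.225]),
])
train_set = datasets.ImageFolder("data/train", transform=transform)
train_loader = DataLoader(train_set, batch_size=32, shuffle=True)

# Transfer learning: freeze the pretrained feature extractor and retrain
# only a new 2-class output layer (venomous vs. non-venomous).
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
for param in model.parameters():
    param.requires_grad = False
model.fc = nn.Linear(model.fc.in_features, 2)

device = "cuda" if torch.cuda.is_available() else "cpu"
model = model.to(device)
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)

for epoch in range(5):
    model.train()
    correct, total = 0, 0
    for images, labels in train_loader:
        images, labels = images.to(device), labels.to(device)
        optimizer.zero_grad()
        logits = model(images)
        loss = criterion(logits, labels)
        loss.backward()
        optimizer.step()
        correct += (logits.argmax(dim=1) == labels).sum().item()
        total += labels.size(0)
    print(f"epoch {epoch + 1}: train accuracy {correct / total:.2%}")

Reported success rates such as the 60% figure in the abstract would be measured on a held-out test set rather than the training accuracy printed here.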

Files in This Item:
File | Size | Format | Access
Poster_Ching Jia Chin_Snake Pattern Detection using Image Recognition.pptx | 524.85 kB | Microsoft PowerPoint | Restricted Access
Deep Learning for Snake Pattern Detection v2.0.pdf | 2.07 MB | Adobe PDF | Restricted Access
alma991016506009405146.html | 146 B | HTML | Restricted Access

Page view(s): 245 (updated on Jun 27, 2022)
Download(s): 29 (updated on Jun 27, 2022)


Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.