Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/77157
Full metadata record
DC Field: Value [Language]
dc.contributor.author: Guo, Yuewen
dc.date.accessioned: 2019-05-14T07:52:05Z
dc.date.available: 2019-05-14T07:52:05Z
dc.date.issued: 2019
dc.identifier.uri: http://hdl.handle.net/10356/77157
dc.description.abstract: With the development of AI technology, more and more decisions are made by algorithms instead of human beings. On the one hand, machines can greatly increase working efficiency and accuracy; on the other, algorithms can be designed to be fairer and more objective. Human decision makers may be subjective or even discriminatory, but a well-designed algorithm can make fairer decisions. This project focuses on one method of mitigating discrimination: data pre-processing. The definitions of fairness and the sources of discrimination are discussed before the algorithms are introduced. One of the most comprehensive algorithms, the Optimised Pre-processing method, is examined through experiments, and five of the most commonly used machine learning classification models are built to validate the algorithm's bias-mitigation performance. [en_US]
dc.format.extent: 34 p. [en_US]
dc.language.iso: en [en_US]
dc.subject: DRNTU::Science::Mathematics [en_US]
dc.title: Fairness analysis in algorithm design [en_US]
dc.type: Final Year Project (FYP) [en_US]
dc.contributor.supervisor: Bei Xiaohui [en_US]
dc.contributor.school: School of Physical and Mathematical Sciences [en_US]
dc.description.degree: Bachelor of Science in Mathematical Sciences [en_US]
item.grantfulltext: restricted
item.fulltext: With Fulltext
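
As an illustration of the workflow the abstract describes, the following is a minimal sketch, assuming the open-source AIF360 toolkit's implementation of Optimized Pre-processing (Calmon et al., 2017) applied to the Adult income dataset with "sex" as the protected attribute. The dataset, the protected attribute, and the optimization options are illustrative assumptions taken from AIF360's demo material, not from the report itself, which may use different data or its own implementation.

```python
# A sketch of Optimized Pre-processing (Calmon et al., 2017) via AIF360.
# Dataset, protected attribute, and all option values are illustrative
# assumptions from AIF360's demo notebooks, not taken from the report.
from aif360.algorithms.preprocessing import OptimPreproc
from aif360.algorithms.preprocessing.optim_preproc_helpers.opt_tools import OptTools
from aif360.algorithms.preprocessing.optim_preproc_helpers.data_preproc_functions import (
    load_preproc_data_adult,
)
from aif360.algorithms.preprocessing.optim_preproc_helpers.distortion_functions import (
    get_distortion_adult,
)
from aif360.metrics import BinaryLabelDatasetMetric

privileged = [{"sex": 1}]
unprivileged = [{"sex": 0}]

# Load the preprocessed Adult income dataset and split off a training set.
train, test = load_preproc_data_adult(["sex"]).split([0.7], shuffle=True)

# epsilon bounds the allowed shift in the outcome distribution; clist/dlist
# constrain how strongly individual records may be distorted.
optim_options = {
    "distortion_fun": get_distortion_adult,
    "epsilon": 0.05,
    "clist": [0.99, 1.99, 2.99],
    "dlist": [0.1, 0.05, 0.0],
}

# Learn a randomized mapping on the training split, then apply it.
op = OptimPreproc(OptTools, optim_options,
                  unprivileged_groups=unprivileged,
                  privileged_groups=privileged).fit(train)
train_transf = op.transform(train, transform_Y=True)

# Statistical parity difference of the labels: closer to 0 means less bias.
for name, ds in [("original", train), ("transformed", train_transf)]:
    metric = BinaryLabelDatasetMetric(
        ds, unprivileged_groups=unprivileged, privileged_groups=privileged)
    print(f"{name}: mean difference = {metric.mean_difference():.3f}")
```

In the same spirit as the report's validation step, the transformed data could then be fed to standard classifiers (e.g., logistic regression or decision trees) and their group-fairness metrics compared against models trained on the original data.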
Appears in Collections: SPMS Student Reports (FYP/IA/PA/PI)
Files in This Item:
FYP_Report.pdf (Restricted Access), 974.45 kB, Adobe PDF

Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.