Please use this identifier to cite or link to this item:
https://hdl.handle.net/10356/77157
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Guo, Yuewen | |
dc.date.accessioned | 2019-05-14T07:52:05Z | |
dc.date.available | 2019-05-14T07:52:05Z | |
dc.date.issued | 2019 | |
dc.identifier.uri | http://hdl.handle.net/10356/77157 | |
dc.description.abstract | With the development of AI technology, more and more decisions are made by algorithms rather than by human beings. On the one hand, machines can greatly increase working efficiency and accuracy; on the other hand, algorithms can be designed to be fairer and more objective. Human decision makers may be subjective or even discriminatory, whereas a well-designed algorithm can produce fairer decisions. This project focuses on one method of mitigating discrimination: data pre-processing. Before the algorithms are introduced, the definitions of fairness and the sources of discrimination are discussed. One of the most comprehensive algorithms, the Optimised Pre-processing method, is examined through experiments, and five of the most commonly used machine learning classification models are built to validate its bias-mitigation performance (an illustrative sketch of such a validation follows the metadata table below). | en_US |
dc.format.extent | 34 p. | en_US |
dc.language.iso | en | en_US |
dc.subject | DRNTU::Science::Mathematics | en_US |
dc.title | Fairness analysis in algorithm design | en_US |
dc.type | Final Year Project (FYP) | en_US |
dc.contributor.supervisor | Bei Xiaohui | en_US |
dc.contributor.school | School of Physical and Mathematical Sciences | en_US |
dc.description.degree | Bachelor of Science in Mathematical Sciences | en_US |
item.grantfulltext | restricted | - |
item.fulltext | With Fulltext | - |
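
For illustration only, and not the project's own code: the sketch below uses scikit-learn and NumPy on synthetic data to show how a trained classifier's predictions can be checked against two common group-fairness metrics, statistical parity difference and disparate impact, in the spirit of the validation described in the abstract. The data generation, feature names, and coefficients are all assumptions made for the example.

```python
# Hypothetical sketch: train one commonly used classifier (logistic regression)
# on synthetic data with a binary protected attribute, then compute two
# standard group-fairness metrics on its predictions.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 5000

# Synthetic data: one protected attribute (0 = unprivileged, 1 = privileged)
# and two ordinary features; the label is correlated with the protected
# attribute to simulate historical discrimination in the training data.
protected = rng.integers(0, 2, size=n)
x1 = rng.normal(size=n) + 0.5 * protected
x2 = rng.normal(size=n)
logits = 0.8 * x1 + 0.5 * x2 + 0.7 * protected - 0.5
y = (rng.random(n) < 1 / (1 + np.exp(-logits))).astype(int)

X = np.column_stack([x1, x2, protected])
X_tr, X_te, y_tr, y_te, prot_tr, prot_te = train_test_split(
    X, y, protected, test_size=0.3, random_state=0
)

clf = LogisticRegression().fit(X_tr, y_tr)
pred = clf.predict(X_te)

# Statistical parity difference: P(pred=1 | unprivileged) - P(pred=1 | privileged).
rate_unpriv = pred[prot_te == 0].mean()
rate_priv = pred[prot_te == 1].mean()
print("Statistical parity difference:", rate_unpriv - rate_priv)

# Disparate impact: ratio of the two selection rates (1.0 means parity).
print("Disparate impact:", rate_unpriv / rate_priv)
```

In a full experiment of the kind the abstract describes, the same metrics would be computed both before and after applying the pre-processing transformation, so that the reduction in bias can be quantified for each classifier.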
Appears in Collections: SPMS Student Reports (FYP/IA/PA/PI)
Files in This Item:
File | Description | Size | Format
---|---|---|---
FYP_Report.pdf (Restricted Access) | | 974.45 kB | Adobe PDF
Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.