Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/154153
Title: Fairness in design : a tool for guidance for ethical artificial intelligence design
Authors: Shu, Ying
Keywords: Engineering::Computer science and engineering
Issue Date: 2021
Publisher: Nanyang Technological University
Source: Shu, Y. (2021). Fairness in design : a tool for guidance for ethical artificial intelligence design. Master's thesis, Nanyang Technological University, Singapore. https://hdl.handle.net/10356/154153
Abstract: As artificial intelligence (AI) becomes increasingly widely applied, societies have recognized the need for proper governance of its responsible usage. An important dimension of responsible AI is fairness. AI systems were once thought to be impartial and fair in their decisions, but studies have shown that biases and discrimination can creep into the data and models, affecting outcomes and even causing harm. Because the notions of fairness are multi-faceted, it is challenging for AI solution designers to anticipate potential fairness issues at the design stage. Furthermore, there are currently few methodologies available for them to incorporate fairness values into their designs. In this thesis, we present the Fairness in Design (FID) methodology and tool, which aim to address this gap; the tool is available in both physical and online formats. It provides AI solution designers with a workflow that allows them to surface fairness concerns, navigate complex ethical choices around fairness, and overcome blind spots and team biases. We have tested the methodology on 10 AI design teams (n = 24), and the results support our hypotheses. Not only would 67% of the participants recommend our physical methodology tool to a friend or colleague, but 79% also indicated that they are interested in using the tool in their future projects. This tool has the potential to add value to the ethical AI field and can be expanded to support other ethical AI dimensions such as privacy preservation and explainability.
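
The abstract's point that fairness is multi-faceted can be made concrete with a minimal sketch (not taken from the thesis; all names, data, and metric choices here are illustrative assumptions): two common group-fairness notions, demographic parity and equalized odds, can disagree on the same toy predictions.

```python
# Illustrative sketch only: two standard group-fairness metrics computed on
# hypothetical toy data. The dataset, function names, and values are made up
# for illustration and are not drawn from the FID methodology itself.

def rate(values):
    """Proportion of positive (1) entries in a list."""
    return sum(values) / len(values) if values else 0.0

# Toy records: (group, true_label, predicted_label)
records = [
    ("A", 1, 1), ("A", 0, 1), ("A", 1, 0), ("A", 0, 0),
    ("B", 1, 1), ("B", 0, 0), ("B", 1, 0), ("B", 0, 0),
]

def selection_rate(group):
    # Fraction of group members predicted positive, regardless of true label.
    return rate([pred for g, _, pred in records if g == group])

def true_positive_rate(group):
    # Fraction of truly positive group members predicted positive.
    return rate([pred for g, label, pred in records if g == group and label == 1])

# Demographic parity: groups should be selected at similar overall rates.
dp_gap = abs(selection_rate("A") - selection_rate("B"))

# Equalized odds (true-positive-rate component): qualified members of each
# group should be selected at similar rates.
eo_gap = abs(true_positive_rate("A") - true_positive_rate("B"))

print(f"Demographic parity gap: {dp_gap:.2f}")    # 0.25 on this toy data
print(f"Equalized odds (TPR) gap: {eo_gap:.2f}")  # 0.00 on this toy data
```

On this toy data the predictions satisfy equal true-positive rates yet violate demographic parity, which is the kind of tension between fairness notions that a design-stage methodology has to help teams reason about.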
URI: https://hdl.handle.net/10356/154153
DOI: 10.32657/10356/154153
Rights: This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License (CC BY-NC 4.0).
Fulltext Permission: open
Fulltext Availability: With Fulltext
Appears in Collections:SCSE Theses

Files in This Item:
File: NTU_Thesis.pdf (7.75 MB, Adobe PDF)

Page view(s): 35 (updated on Jan 23, 2022)
Download(s): 22 (updated on Jan 23, 2022)

Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.