Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/175276
Title: Deep learning and computer chess (Part 1): using neural networks for chess evaluation functions
Authors: U, Jeremy Keat
Keywords: Computer and Information Science
Issue Date: 2024
Publisher: Nanyang Technological University
Source: U, J. K. (2024). Deep learning and computer chess (Part 1): using neural networks for chess evaluation functions. Final Year Project (FYP), Nanyang Technological University, Singapore. https://hdl.handle.net/10356/175276
Abstract: This report presents the implementation of two different chess evaluation functions based on the Giraffe and DeepChess papers. In the first implementation, the evaluator network architecture from Giraffe’s evaluation function was adapted into a multiclass classifier that predicts one of seven classes of Stockfish evaluation through supervised learning. Experiments were conducted to gauge the effectiveness of the input feature representation and of dropout regularisation. The second implementation, based on DeepChess, takes a different approach: a Siamese network compares two chess positions and outputs which of the two is more advantageous, framing board evaluation as binary classification. The network was trained in a two-stage process combining unsupervised and supervised learning. Experiments were conducted to observe the effect of freezing pretrained layer weights and of changing layer activation functions to LeakyReLU. (An illustrative sketch of both network designs appears after this record.)
URI: https://hdl.handle.net/10356/175276
Schools: School of Computer Science and Engineering 
Fulltext Permission: restricted
Fulltext Availability: With Fulltext
Appears in Collections:SCSE Student Reports (FYP/IA/PA/PI)
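
Illustrative sketch: the abstract describes two network designs, and the PyTorch sketch below mirrors their shapes only. It is not the report's implementation: the 773-feature input size follows the original DeepChess bitboard encoding, while all layer widths, the dropout rate, the seven-way binning, and the freeze_encoder flag are assumptions made here for illustration.

import torch
import torch.nn as nn

# Illustrative sketch only. INPUT_SIZE = 773 follows the DeepChess bitboard
# encoding; the report's Giraffe-style classifier uses its own feature
# representation, so reusing this size for it here is purely for convenience.
INPUT_SIZE = 773
NUM_EVAL_CLASSES = 7  # the report bins Stockfish evaluations into 7 classes

class GiraffeStyleClassifier(nn.Module):
    """First implementation: a Giraffe-style evaluator adapted into a
    multiclass classifier over binned Stockfish evaluations."""
    def __init__(self, dropout=0.2):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(INPUT_SIZE, 512), nn.ReLU(), nn.Dropout(dropout),
            nn.Linear(512, 256), nn.ReLU(), nn.Dropout(dropout),
            nn.Linear(256, NUM_EVAL_CLASSES),  # logits over the 7 eval bins
        )

    def forward(self, position):
        return self.net(position)

class PositionEncoder(nn.Module):
    """Shared feature extractor for one position. In the DeepChess approach
    this part is pretrained as a deep autoencoder (unsupervised stage) before
    the supervised comparison stage."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(INPUT_SIZE, 600), nn.LeakyReLU(),
            nn.Linear(600, 400), nn.LeakyReLU(),
            nn.Linear(400, 200), nn.LeakyReLU(),
            nn.Linear(200, 100), nn.LeakyReLU(),
        )

    def forward(self, position):
        return self.net(position)

class SiameseComparator(nn.Module):
    """Second implementation: encode two positions with the same weights and
    predict which of the pair is more advantageous (binary classification)."""
    def __init__(self, freeze_encoder=False):
        super().__init__()
        self.encoder = PositionEncoder()
        if freeze_encoder:  # one reported experiment: freeze pretrained layers
            for p in self.encoder.parameters():
                p.requires_grad = False
        self.head = nn.Sequential(
            nn.Linear(200, 100), nn.LeakyReLU(),
            nn.Linear(100, 2),  # logits: [first position better, second better]
        )

    def forward(self, left, right):
        pair = torch.cat([self.encoder(left), self.encoder(right)], dim=1)
        return self.head(pair)

if __name__ == "__main__":
    left, right = torch.randn(4, INPUT_SIZE), torch.randn(4, INPUT_SIZE)
    print(GiraffeStyleClassifier()(left).shape)                       # torch.Size([4, 7])
    print(SiameseComparator(freeze_encoder=True)(left, right).shape)  # torch.Size([4, 2])

The weight sharing that makes the second network "Siamese" comes simply from calling the same encoder module on both inputs, so both positions are mapped through identical parameters before the comparison head sees them.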

Files in This Item:
File: FYP_Report_SCSE23-0341.pdf (Restricted Access)
Size: 1.4 MB
Format: Adobe PDF

Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.