Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/137570
Title: Do humans process data like Stata? An experimental study
Authors: Chan, Yi Rong
Goh, Yun Sheen
Pei, Jiaoying
Keywords: Social sciences::Statistics
Issue Date: 2020
Publisher: Nanyang Technological University
Project: HE_1AY1920_6
Abstract: The least squares (LS) learning model is one of the most seminal models of how individuals can learn a rational expectations equilibrium (REE) if they do not initially start from it. According to this model, agents estimate the data generating process (DGP) of the market price using ordinary least squares (OLS) in an iterated way. In this paper, we test whether and how agents converge to the REE in the lab, replacing the prediction task in the Learning to Forecast Experiment (LtFE) from a point prediction of the price to the parameters of the DGP. About 17% of the individual predictions can be categorised as following the LS learning rule, though there is a lack of evidence of its adoption at the aggregate level. We also design two treatments to investigate the effect of the spread of the independent variable on the speed of learning. Our results show that learning is much faster and convergence much more frequent when the spread of the independent variable (“weather”) of the DGP is larger. Consistent with econometric theory, we also find experimentally that the variance of the estimates is smaller in the wider-spread treatment, though the difference in dispersion between the two treatments is not statistically significant.
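The iterated-OLS mechanism described in the abstract can be sketched as follows. This is a minimal illustrative simulation, not the paper's actual experimental design or code: the linear DGP (price = a + b × weather + noise), the parameter values, and the spread values for the two treatments are all assumptions made for the example. Each period, a simulated agent re-estimates the DGP's parameters by OLS on all observations so far, and a wider spread of the "weather" regressor yields tighter estimates, mirroring the spread effect the abstract reports.

```python
import numpy as np

rng = np.random.default_rng(0)

def ls_learning(weather_spread, periods=200, a=10.0, b=2.0, noise_sd=1.0):
    """Iterated OLS learning of a linear price DGP.

    Each period t, the agent re-estimates (a, b) by OLS on all data
    observed up to t, mimicking least-squares learning of the DGP
    price = a + b * weather + noise. Illustrative values only.
    """
    weather = rng.uniform(-weather_spread, weather_spread, periods)
    price = a + b * weather + rng.normal(0.0, noise_sd, periods)
    estimates = []
    for t in range(2, periods):  # need at least 2 points to fit 2 parameters
        X = np.column_stack([np.ones(t), weather[:t]])  # design matrix
        beta_hat, *_ = np.linalg.lstsq(X, price[:t], rcond=None)
        estimates.append(beta_hat)
    return np.array(estimates)  # rows: (a_hat, b_hat) over time

# Two hypothetical treatments: narrow vs. wide spread of "weather".
narrow = ls_learning(weather_spread=1.0)
wide = ls_learning(weather_spread=5.0)
print("final estimates, narrow spread:", narrow[-1])
print("final estimates, wide spread:  ", wide[-1])
```

Standard OLS theory gives Var(b̂) ∝ σ² / (n · Var(weather)), so the slope estimates settle near the true values faster when the weather spread is wider, which is the mechanism behind the spread treatments discussed above.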
URI: https://hdl.handle.net/10356/137570
Fulltext Permission: restricted
Fulltext Availability: With Fulltext
Appears in Collections: SSS Student Reports (FYP/IA/PA/PI)

Files in This Item:
File: HE_1AY1920_6.pdf (Restricted Access), 1.27 MB, Adobe PDF

Page view(s): 188 (updated on May 9, 2021)
Download(s): 72 (updated on May 9, 2021)


Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.