Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/85539
Full metadata record
DC Field | Value | Language
dc.contributor.author | Sesagiri Raamkumar, Aravind | en
dc.contributor.author | Foo, Schubert | en
dc.contributor.author | Pang, Natalie | en
dc.contributor.editor | Lewandowski, Dirk | en
dc.date.accessioned | 2017-09-18T05:40:13Z | en
dc.date.accessioned | 2019-12-06T16:05:36Z | -
dc.date.available | 2017-09-18T05:40:13Z | en
dc.date.available | 2019-12-06T16:05:36Z | -
dc.date.issued | 2017 | en
dc.identifier.citation | Sesagiri Raamkumar, A., Foo, S., & Pang, N. (2017). User evaluation of a task for shortlisting papers from researcher’s reading list for citing in manuscripts. Aslib Journal of Information Management, in press. | en
dc.identifier.issn | 2050-3806 | en
dc.identifier.uri | https://hdl.handle.net/10356/85539 | -
dc.description.abstract | Purpose: Although many interventional approaches have been proposed to address the apparent gap between novices and experts for literature review (LR) search tasks, very few approaches have been proposed for manuscript preparation (MP) related tasks. This paper describes a task, and an accompanying technique, for shortlisting important and unique papers from a researcher’s reading list for citation in a manuscript. Design/methodology/approach: A user evaluation study was conducted on a prototype system built to support the shortlisting papers (SP) task along with two other LR search tasks. A total of 119 researchers with experience in authoring research papers participated in the study. Participants evaluated the task through an online questionnaire, and both quantitative and qualitative analyses were performed on the collected evaluation data. Findings: Graduate research students preferred this task more than research and academic staff did. The evaluation measures relevance, usefulness and certainty were identified as predictors of the output quality measure ‘good list’. The shortlisting feature and information cues were the preferred aspects, while the limited dataset and rote steps in the study were identified as critical aspects in the participants’ qualitative feedback. Originality/value: The findings indicate that researchers are clearly interested in this novel task of shortlisting papers from the final reading list prepared during literature review. This has implications for digital libraries, academic databases and reference management software, where this task could be included to benefit researchers at the manuscript preparation stage of the research lifecycle. | en
dc.description.sponsorship | NRF (Natl Research Foundation, S’pore) | en
dc.format.extent | 24 p. | en
dc.language.iso | en | en
dc.relation.ispartofseries | Aslib Journal of Information Management | en
dc.rights | © 2017 Emerald. This is the author-created version of a work that has been peer reviewed and accepted for publication by Aslib Journal of Information Management, Emerald. It incorporates the referees’ comments, but changes resulting from the publishing process, such as copyediting and structural formatting, may not be reflected in this document. The published version is available at: [http://dx.doi.org/10.1108/AJIM-01-2017-0020]. | en
dc.subject | Manuscript preparation | en
dc.subject | Shortlisting citations | en
dc.title | User evaluation of a task for shortlisting papers from researcher’s reading list for citing in manuscripts | en
dc.type | Journal Article | en
dc.contributor.school | Wee Kim Wee School of Communication and Information | en
dc.identifier.doi | 10.1108/AJIM-01-2017-0020 | en
dc.description.version | Accepted version | en
item.fulltext | With Fulltext | -
item.grantfulltext | open | -
Appears in Collections: WKWSCI Journal Articles
Files in This Item:
File | Description | Size | Format
AJIM-2017-final-RG-sharing.pdf | Main article | 963.73 kB | Adobe PDF

Page view(s): 274 (updated on Apr 19, 2021)
Download(s): 125 (updated on Apr 19, 2021)
Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.