Researchers' judgment criteria of high-quality answers on academic social Q&A platforms
Purpose: Through a two-stage survey, this paper examines how researchers judge the quality of answers on ResearchGate Q&A, an academic social networking site. Design/methodology/approach: In the first-stage survey, 15 researchers from Library and Information Science (LIS) judged the quality of 157 answers to 15 questions and reported the criteria that they had used. The content of their reports was analyzed, and the results were merged with relevant criteria from the literature to form the second-stage survey questionnaire. This questionnaire was then completed by researchers recognized as accomplished at identifying high-quality LIS answers on ResearchGate Q&A. Findings: Most of the identified quality criteria for academic answers—such as relevance, completeness, and verifiability—have previously been found applicable to generic answers. The authors also found other criteria, such as comprehensiveness, the answerer's scholarship, and value-added. Providing opinions was found to be the most important criterion, followed by completeness and value-added. Originality/value: The findings here show the importance of studying the quality of answers on academic social Q&A platforms and reveal unique considerations for the design of such systems.
Year of publication: 2020
Authors: Li, Lei; Zhang, Chengzhi; He, Daqing; Du, Jia Tina
Published in: Online Information Review. - Emerald, ISSN 1468-4527, ZDB-ID 2014462-3. - Vol. 44.2020, 3 (14.02.), p. 603-623
Publisher: Emerald
Online Resource
Similar items by person
- Characterizing peer-judged answer quality on academic Q&A sites / Li, Lei (2018)
- Detecting Users' Dietary Preferences and Their Evolutions via Chinese Social Media / Zhou, Qingqing (2018)
- Examining differences among book reviews from various online platforms / Zhang, Chengzhi (2019)