The impacts of learning analytics and A/B testing research: a case study in differential scientometrics.

A/B testing; Learning analytics; Online learning; STEM education platform; Scientometrics

Journal

International journal of STEM education
ISSN: 2196-7822
Abbreviated title: Int J STEM Educ
Country: Germany
NLM ID: 101738908

Publication information

Publication date:
2022
History:
received: 6 July 2021
accepted: 21 January 2022
entrez: 23 February 2022
pubmed: 24 February 2022
medline: 24 February 2022
Status: ppublish

Abstract

BACKGROUND
In recent years, research on online learning platforms has exploded in quantity. More and more researchers are using these platforms to conduct A/B tests on the impact of different designs, and multiple scientific communities have emerged around studying the big data becoming available from these platforms. However, it is not yet fully understood how each type of research influences future scientific discourse within the broader field. To address this gap, this paper presents the first scientometric study of how researchers build on the contributions of these two types of online learning platform research (particularly in STEM education). We selected a pair of papers, one using A/B testing and the other conducting learning analytics (LA), both based on data from the same online STEM education platform, published in the same year, by the same research group, at the same conference. We then analyzed each of the papers that cited these two papers, coding the reason for each citation from the paper text (with inter-rater reliability checks).
RESULTS
After statistically comparing the frequency of each category of citation between the two papers, we found that the A/B test paper was self-cited more and that citing papers built on its work directly more frequently, whereas the LA paper was more often cited without discussion.
CONCLUSIONS
Hence, the A/B test paper appeared to have had a larger impact on future work than the LA paper, even though the LA paper had a higher total citation count and a lower degree of self-citation. This paper also established a novel method for understanding how different types of research make different contributions to learning analytics and to the broader online learning research space of STEM education.
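The record does not specify which statistical procedures or coding categories the authors used, but the workflow described in the abstract (coding each citation into categories with inter-rater reliability checks, then comparing category frequencies between the two papers) can be illustrated with a minimal sketch. The category names, counts, and coder labels below are hypothetical, and the use of Fisher's exact test and Cohen's kappa is an assumption made for illustration, not necessarily the study's actual analysis.

# Minimal sketch (hypothetical data): comparing citation-category frequencies
# between an A/B test paper and a learning analytics (LA) paper, and checking
# inter-rater agreement on the citation coding. Fisher's exact test and
# Cohen's kappa are illustrative choices, not the study's confirmed methods.
from scipy.stats import fisher_exact
from sklearn.metrics import cohen_kappa_score

# Hypothetical counts of citations per coded category for each paper.
counts = {
    # category: (A/B test paper, LA paper)
    "self-citation": (12, 5),
    "builds on directly": (18, 7),
    "cited without discussion": (20, 45),
}
total_ab = sum(ab for ab, _ in counts.values())
total_la = sum(la for _, la in counts.values())

for category, (ab, la) in counts.items():
    # 2x2 table: this category vs. all other citations, for each paper.
    table = [[ab, total_ab - ab], [la, total_la - la]]
    odds_ratio, p_value = fisher_exact(table)
    print(f"{category}: odds ratio = {odds_ratio:.2f}, p = {p_value:.3f}")

# Inter-rater reliability: two coders labelling the same sample of citations.
coder_1 = ["self-citation", "builds on directly", "cited without discussion", "builds on directly"]
coder_2 = ["self-citation", "cited without discussion", "cited without discussion", "builds on directly"]
print("Cohen's kappa:", cohen_kappa_score(coder_1, coder_2))

In the study itself, the full set of citing papers and the authors' own coding scheme would replace the toy values above.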

Identifiers

pubmed: 35194544
doi: 10.1186/s40594-022-00330-6
pii: 330
pmc: PMC8853091

Publication types

Journal Article

Language

eng

Pagination

16

Copyright information

© The Author(s) 2022.

Conflict of interest statement

Competing interests: The authors declare that they have no competing interests in this research.

Authors

Ryan S Baker (RS)

Graduate School of Education, University of Pennsylvania, Philadelphia, PA 19104 USA.

Nidhi Nasiar (N)

Graduate School of Education, University of Pennsylvania, Philadelphia, PA 19104 USA.

Weiyi Gong (W)

Graduate School of Education, University of Pennsylvania, Philadelphia, PA 19104 USA.

Chelsea Porter (C)

Graduate School of Education, University of Pennsylvania, Philadelphia, PA 19104 USA.

MeSH classifications