lab.js: A free, open, online study builder.

Keywords: Experiment; JavaScript; Online data collection; Open science; Open source; Software
Journal
Behavior Research Methods
ISSN: 1554-3528
Abbreviated title: Behav Res Methods
Country: United States
NLM ID: 101244316
Publication information
Publication date: April 2022
History:
pubmed: 30 July 2021
medline: 30 April 2022
entrez: 29 July 2021
Status: ppublish
Abstract
Web-based data collection is increasingly popular in both experimental and survey-based research because it is flexible, efficient, and location-independent. While dedicated software for laboratory-based experimentation and online surveys is commonplace, researchers looking to implement experiments in the browser have, until now, often had to construct their studies' content and logic manually in code. We introduce lab.js, a free, open-source experiment builder that makes it easy to build studies for both online and in-laboratory data collection. Through its visual interface, stimuli can be designed and combined into a study without programming, though studies' appearance and behavior can be fully customized using HTML, CSS, and JavaScript code if required. Presentation and response times are measured with an accuracy and precision previously unmatched in browser-based studies. Experiments constructed with lab.js can be run directly on a local computer and published online with ease, with direct deployment to cloud hosting, export to web servers, and integration with popular data collection platforms. Studies can also be shared in an editable format, archived, re-used, and adapted, enabling effortless, transparent replications and thus facilitating open, cumulative science. The software is provided free of charge under an open-source license; further information, code, and extensive documentation are available from https://lab.js.org/.
Identifiers
pubmed: 34322854
doi: 10.3758/s13428-019-01283-5
pii: 10.3758/s13428-019-01283-5
pmc: PMC9046347
Publication types
Journal Article
Research Support, Non-U.S. Gov't
Languages
eng
Citation subsets
IM
Pagination
556-573
Copyright information
© 2021. The Author(s).