No. 13 (2021): Initiatives for the International Assessment of Learning Outcomes in Vocational Education and Training and Higher Education
Special Issue

PERFORMANCE ASSESSMENT AND DIGITAL TRAINING FOR GENERIC AND DOMAIN-SPECIFIC CRITICAL REASONING OF YOUNG PROFESSIONALS IN THE PRACTICAL PHASE IN LAW, MEDICINE, AND TEACHING

Olga Zlatkin-Troitschanskaia
Professor
Published July 16, 2021

Keywords:

Digital competencies, critical reasoning, post-university education, online training tools, performance assessment
How to cite
Zlatkin-Troitschanskaia, O., Brückner, S., Nagel, M.-T., Bültmann, A.-K., Fischer, J., Schmidt, S., & Molerov, D. (2021). EVALUACIÓN DEL RENDIMIENTO Y ENTRENAMIENTO DIGITAL PARA EL RAZONAMIENTO CRÍTICO GENÉRICO Y ESPECÍFICO PARA JÓVENES PROFESIONALES EN LA FASE PRÁCTICA EN EL DERECHO, LA MEDICINA Y LA DOCENCIA. Journal of Supranational Policies of Education, (13), 9–36. https://doi.org/10.15366/jospoe2021.13.001

Abstract

In the digital age, the Internet is increasingly regarded as an important source of information. This is especially true for informal learning, for example in post-university education. Evidently, young professionals increasingly use online sources as an information and learning tool. Critical reasoning about online information for learning and professional processes in medicine, law, and teaching is considered a highly relevant facet of competence. For example, staying up to date on a multitude of issues, such as those published in articles and guidelines, as is the case in medicine, can be challenging when the competencies needed to use online media are not sufficiently developed (Allen et al., 2005; O'Carroll et al., 2015). Current research on university students indicates substantial deficits in their critical online reasoning skills, including among graduates. However, online information seeking and the corresponding competencies among young professionals in the practical phase have not yet been investigated. There is a lack both of valid profession-specific assessments and of training tools that can effectively foster the competent use of online information among young professionals in practice.
Our research is part of the collaborative project BRIDGE, which belongs to the program "Research on the Design of Educational Processes under the Conditions of Digital Change". This study builds on our previous work on assessing generic skills in higher education in the international projects CLA+, iPAL, and CORA, as well as on experience with profession-specific performance assessments gathered in the research programs KoKoHs and ASCOT+, which evaluated professional competencies. To validly assess critical online reasoning among young professionals (in law, medicine, and teaching), we developed new online performance assessments and corresponding digital trainings. The aim is to analyze the extent to which young professionals use online information more reflectively when preparing professional documents after having participated in an online training. We use process and performance data (drawing on innovative approaches such as text mining and educational data mining). In this article, we present the conceptual and assessment framework of the newly developed instruments for measuring and fostering generic and domain-specific critical online reasoning among young professionals in the practical phase in medicine, law, and teaching. Based on this framework, we discuss how these important facets of professional competence can be validly measured and effectively fostered in practice.


References

Abreu, B. S. de, Mihailidis, P., Lee, A. Y. L., Melki, J., & McDougall, J. (Eds.). (2017). International handbook of media literacy education. New York, London: Routledge Taylor & Francis Group.

AERA, APA & NCME (2014). Standards for educational and psychological testing. AERA.

Ainley, J., Fraillon, J., Schulz, W., & Gebhardt, E. (2016). Conceptualizing and Measuring Computer and Information Literacy in Cross-National Contexts. Applied Measurement in Education, 29(4), 291–309. https://doi.org/10.1080/08957347.2016.1209205

Alexander, P. A., Jablansky, S., Singer, L. M., & Dumas, D. (2016). Relational reasoning: What we know and why it matters. Policy insights from the behavioral and brain sciences, 3(1), 36-44. https://doi.org/10.1177/2372732215622029

Alexander, P. A. (2003). The Development of Expertise: The Journey From Acclimation to Proficiency. Educational Researcher, 32(8), 10-14. https://doi.org/10.3102/0013189X032008010

Allen, D. & Harkins, K.J. (2005). Too much guidance? The Lancet, 365(9473), 21-27.

Amin, J. (2016). Redefining the Role of Teachers in the Digital Era. International Journal of Indian Psychology, 3(3). https://doi.org/10.25215/0303.101

Askew, S. (2004). Feedback for Learning. Routledge.

Banerjee, M., Zlatkin-Troitschanskaia, O., & Roeper, J. (2020). Narratives and Their Impact on Students’ Information Seeking and Critical Online Reasoning in Higher Education Economics and Medicine. Frontiers in Education, 5. https://doi.org/10.3389/feduc.2020.570625

Basak, D., & Schimmel, R. (2008). Internet im Jurastudium – Plädoyer für einen wohlüberlegten Einsatz des WWW. Zeitschrift für das juristische Studium, 4(94), 435-440.

Benjes-Small, C., Archer, A., Tucker, K., Vassady, L. & Resor, J. (2013). Teaching Web Evaluation. Communications in Information Literacy, 7(1), 39-49.

Blakeslee, S. (2004). The CRAAP Test. LOEX Quarterly, 31(3), 6-7.

Bundesministerium für Bildung und Forschung (BMBF) (2015). Technologiebasierte Kompetenzmessung in der beruflichen Bildung (ASCOT) Ergebnisse und Bedeutung für Politik und Praxis. BMBF.

Braasch, J. L. G., & Bråten, I. (2017). The Discrepancy-Induced Source Comprehension (D-ISC) Model: Basic Assumptions and Preliminary Evidence. Educational Psychologist, 52(3), 167-181. https://doi.org/10.1080/00461520.2017.1323219

Branch, R.M. (2009). Instructional Design: The ADDIE-Approach. New York: Springer

Brand-Gruwel, S., Kammerer, Y., van Meeuwen, L., & van Gog, T. (2017). Source evaluation of domain experts and novices during Web search. Journal of Computer Assisted Learning, 33(3), 234-251. https://doi.org/10.1111/jcal.12162

Brand-Gruwel, S. & Wopereis, I. (2006). Integration of the information problem-solving skill in educational programme: The effects of learning with authentic tasks. Technology, Instruction, Cognition, and Learning, 4, 243-263.

Bransford, J. D., Brown, A. L., & Cocking, R. R. (2000). How People Learn: Brain, Mind, Experience, and School. National Academy Press.

Brooks, C. (2016). ECAR study of students and Information Technology. ECAR.

Brückner, S., & Pellegrino, J. W. (2016). Integrating the Analysis of Mental Operations Into Multilevel Models to Validate an Assessment of Higher Education Students’ Competency in Business and Economics. Journal of Educational Measurement, 53(3), 293-312. https://doi.org/10.1111/jedm.12113

Booth, C. (2011). Reflective teaching, effective learning: Instructional literacy for library educators. Chicago: American Library Association.

Cacioppo, J. T. & Petty, R. E. (1982). The need for cognition. Journal of Personality and Social Psychology, 42, 116–131.

Campbell, D. T., & Fiske, D. W. (1959). Convergent and discriminant validation by the multitrait-multimethod matrix. Psychological Bulletin, 56(2), 81-105.

Chiu, Y.-L., Liang, Y.-C., & Tsai, C.-C. (2013). Internet-specific epistemic beliefs and self-regulated learning in online academic information searching. Metacogn. Learn. 8, 235–260. doi: 10.1007/s11409-013-9103-x

Christensson, K. (2006). RADCAB: Your Vehicle for Information Evaluation. Fort Atkinson: Highsmith Inc.

Dabbagh, N. & Kitsantas, A. (2012). Personal Learning Environments, social media, and self-regulated learning: A natural formula for connecting formal and informal learning. The Internet and Higher Education, 15(1), 3-8.

Davey, T., Ferrara, S., Shavelson, R., Holland, P., Webb, N., & Wise, L. (2015). Psychometric considerations for the next generation of performance assessment. Washington, DC: Center for K-12 Assessment & Performance Management, Educational Testing Service. Retrieved from https://www.ets.org/Media/Research/pdf/psychometric_considerations_white_paper.pdf

Ercikan, K., & Pellegrino, J. W. (Eds.). (2017). NCME applications of educational measurement and assessment book series. Validation of score meaning for the next generation of assessments: The use of response processes. Routledge.

Fabry, G. (2016). Warum Hochschuldidaktik? Die Perspektive der Humanmedizin. Zeitschrift Für Didaktik Der Rechtswissenschaft, 3(2), 136-151. https://doi.org/10.5771/2196-7261-2016-2-136

Facione, P. (1990). The Delphi Report: Executive Summary; Critical thinking: A Statement of Expert Consensus for Purposes of Educational Assessment and Instruction. The California Academic Press.

Fischer, F. (Ed.). (2018). Scientific reasoning and argumentation: The roles of domain-specific and domain-general knowledge. New York NY: Routledge.

Frederick, S. (2005). Cognitive Reflection and Decision Making. Journal of Economic Perspectives, 19(4). 25–42. https://doi.org/10.1257/089533005775196732

Gadiraju, U., Yu, R., Dietze, S., & Holtz, P. (2018). Analyzing Knowledge Gain of Users in Informational Search Sessions on the Web. In C. Shah, N. J. Belkin, K. Byström, J. Huang, & F. Scholer (Eds.), CHIIR'18: Proceedings of the 2018 Conference on Human Information Interaction & Retrieval. (pp. 2-11). The Association for Computing Machinery. https://doi.org/10.1145/3176349.3176381

Gikandi, J.W., Morrow, D. & Davis, N.E. (2011). Online Formative Assessment in Higher Education: A Review of the Literature. Computers and Education, 57, 2333-2351. http://dx.doi.org/10.1016/j.compedu.2011.06.004

Goldhammer, F., & Zehner, F. (2017). What to Make Of and How to Interpret Process Data. Measurement: Interdisciplinary Research & Perspective, 15(3-4), 128-132. https://doi.org/10.1080/15366367.2017.1411651

Hague, C. & Payton, S. (2010). Digital Literacy Across the Curriculum. Futurelab.

Hargittai, E., Fullerton, L., Menchen-Trevino, E., & Thomas, K. Y. (2010). Trust Online: Young Adults’ Evaluation of Web Content. International Journal of Communication, 4, 468-494.

Hattie, J. & Timperley, H. (2007). The Power of Feedback. Review of Educational Research. 77(1), 81–112.

Hatziapostolou, T. & Paraskakis, I. (2010). Enhancing the Impact of Formative Feedback on Student Learning through an Online Feedback System. Electronic Journal of e-Learning, 8(2), 111-122.

Hocevar, K. P., Flanagin, A. J., & Metzger, M. J. (2014). Social media self-efficacy and information evaluation online. Computers in Human Behavior, 39, 254-262. https://doi.org/10.1016/j.chb.2014.07.020

Hölscher, C., & Strube, G. (2000). Web search behaviour of Internet experts and newbies. Computer Networks, 33(1), 1–6. https://doi.org/10.1016/S1389-1286(00)00031-1

Hodell, C. (2007). Basics of Instructional Systems Development. Alexandria: ASTD.

Hubley, A. M., & Zumbo, B. D. (2011). Validity and the Consequences of Test Interpretation and Use. Social Indicators Research, 103(2), 219-230. https://doi.org/10.1007/s11205-011-9843-4

Jacob, O., Weiß, N. & Schweig, J. (2011). Konzeption und Gestaltung von Management Dashboards. Working Paper, Nr. 18, Hochschule für Angewandte Wissenschaften Neu-Ulm.

Jahn, D., & Kenner, A. (2018). Critical Thinking in Higher Education: How to foster it using Digital Media. In D. Kergel, B. Heidkamp, P. K. Telléus, T. Rachwal, & S. Nowakowski (Eds.), The Digital Turn in Higher Education (pp. 81–109). Wiesbaden: Springer Fachmedien Wiesbaden. https://doi.org/10.1007/978-3-658-19925-8_7

Ibabe, I. & Jauregizar, J. (2010). Online self-assessment with feedback and metacognitive knowledge. Higher Education, 59, 243–258

Kimmerle, J., Moskaliuk, J., Oeberst, A. & Cress, U. (2015). Learning and Collective Knowledge Construction With Social Media: A Process-Oriented Perspective. Educational Psychologist, 50(2), 120-137. https://doi.org/10.1080/00461520.2015.1036273

Kohmer, A. (2020). Entwicklung und Validierung eines Trainings zur Erfassung und Förderung des kritischen Umgangs mit Online-Medien. Master's thesis.

Kuhn, S., Müller, N., Kirchgässer, E., Ulzheimer, L. & Lucia Deutsch, K. (2020). Digital skills for medical students – qualitative evaluation of the curriculum 4.0 “Medicine in the digital age”. Journal for Medical Education, 37(6), Doc60. https://doi.org/10.3205/zma001353

Lachner, A., Burkhart, C. & Nückles, M. (2017). Formative computer-based feedback in the university classroom: Specific concept maps scaffold students‘ writing. Computers in Human Behavior, 72, 459–469.

Lauterbach, B. (2020). Konzeption und Entwicklung eines Feedbacksystems als Teil eines digitalen Lehr-Lernarrangements zur Förderung von Critical Online Reasoning (COR). Master's thesis.

Leighton, J. P. (2017). Using think-aloud interviews and cognitive labs in educational research. Understanding qualitative research. Oxford University Press

Li, Z., Banerjee, J., & Zumbo, B. D. (2017). Response Time Data as Validity Evidence: Has it lived up to its promise and, if not, what would it take to do so. In B. D. Zumbo & A. M. Hubley (Eds.), Understanding and Investigating Response Processes in Validation Research (pp. 159-178). Springer International Publishing.

Liepmann, D., Beauducel, A., Brocke, B., & Amthauer, R. (2007). Intelligenz-Struktur-Test 2000 R. Hogrefe.

List, A., & Alexander, P. A. (2017). Analyzing and Integrating Models of Multiple Text Comprehension. Educational Psychologist, 52(3), 143-147. https://doi.org/10.1080/00461520.2017.1328309

Maireder, A. & Nagl, M. (2010). Internet in der Schule, Schule im Internet. Schulische Kommunikationskultur in der Informationsgesellschaft. In mediamanual. Texte 2010, Nr.1. http://www2.mediamanual.at.

Mason, B. J. & Bruning, R. H. (2001). Providing feedback in computer-based instruction: What the research tells us. Center for Instructional Innovation.

Mathson, S. M. & Lorenzen, M. G. (2008). We won't be fooled again: teaching critical thinking via evaluation of hoax and historical revisionist Websites in a library credit course. College & Undergraduate Libraries, 15(1/2), 211-230.

Maurer, M., Schemer, C., Zlatkin-Troitschanskaia, O. & Jitomirski, J. (2020). Positive and Negative Media Effects on University Students’ Learning: Preliminary Findings and a Research Program. In O. Zlatkin-Troitschanskaia (Eds.), Frontiers and Advances in Positive Learning in the Age of Information (pp. 109–119). Springer. https://doi.org/10.1007/978-3-030-26578-6_8

Mayer, R. E. (2009). Multimedia learning (2nd ed.). Cambridge, England: Cambridge University Press.

Mathson, S. M. & Lorenzen, M. G. (2008). We won't be fooled again: teaching critical thinking via evaluation of hoax and historical revisionist Websites in a library credit course. College & Undergraduate Libraries, 15(1/2), 211-230.

Meredith, S. (2010). First year law students, legal research skills & electronic resources. The Law Teacher, 41(2), 191-205. https://doi.org/10.1080/03069400.2007.9959738

McGrew, S., Smith, M., Breakstone, J., Ortega, T. & Wineburg, S. (2019). Improving university students’ web savvy: An intervention study. British Journal of Educational Psychology, 89(3), 485-500. https://doi.org/10.1111/bjep.12279

McGrew, S., Breakstone, J., Ortega, T., Smith, M., & Wineburg, S. (2018). Can students evaluate online sources? Learning from assessments of civic online reasoning. Theory and Research in Social Education, 46(2), 165–193. https://doi.org/10.1080/00933104.2017.1416320

McGrew, S., Ortega, T., Breakstone, J., & Wineburg, S. (2017). The challenge that’s bigger than fake news. Civic reasoning in a social media environment. American Educator, 41(9), 4–9.

Mcquail, D. (1991). Media Performance Assessment in the Public Interest: Principles and Methods. Annals of the International Communication Association, 14(1), 111-145. https://doi.org/10.1080/23808985.1991.11678782

Mesko, B. & Győrffy, Z. (2019). The Rise of the Empowered Physician in the Digital Health Era: Viewpoint. Journal of Medical Internet Research, 21(3), e12490.

Merriënboer, J. J. G. van, Clark, R. E. & Croock, M. B. M. de (2002). Blueprints for complex learning: The 4C/ID-model. Educational Technology Research and Development, 50(2), 39-61.

Mielke, B. & Wolff, C. (2012). Ausbildungskonzepte zur Verbesserung juristischer Informationskompetenz. Vortrag präsentiert auf der IRIS Konferenz, Salzburg. https://www.researchgate.net/publication/236340879_Ausbildungskonzepze_zur_Verbesserung_juristischer_Informationskompetenz

Molerov, D., Zlatkin-Troitschanskaia, O., Nagel, M.T., Brückner, S., Schmidt, S. & Shavelson, R. (2020). Assessing University Students’ Critical Online Reasoning Ability: A Conceptual and Assessment Framework with Preliminary Evidence. Frontiers in Education, 5(1), 1-29. https://doi.org/10.3389/feduc.2020.577843

Molerov, D., Zlatkin-Troitschanskaia, O., and Schmidt, S. (2019). Adapting the civic online reasoning assessment cross-nationally using an explicit functional equivalence approach. In Annual Meeting of the American Educational Research Association (Toronto).

Munzner, T. (2009). A nested model for visualization design and validation. IEEE Transactions on Visualization and Computer Graphics, 15(6), 921-928. https://doi.org/10.1109/TVCG.2009.111.

Nagel, M.-T., Schäfer, S., Zlatkin-Troitschanskaia, O., Schemer, C., Maurer, M. & Molerov, D. (2020a). How do university students’ web search behavior, website characteristics, and the interaction of both influence students’ critical online reasoning? Frontiers in Education, 5(1). https://doi.org/10.3389/feduc.2020.565062

Nagel, M.-T., Zlatkin-Troitschanskaia, O., Schmidt, S., & Beck, K. (2020b). Performance Assessment of Generic and Domain-Specific Skills in Higher Education Economics. In O. Zlatkin-Troitschanskaia, H. A. Pant, M. Toepper & C. Lautenbach (Eds.), Student Learning in German Higher Education: Innovative Measurement Approaches and Research Results (p. 281–299). Wiesbaden: Springer VS. https://doi.org/10.1007/978-3-658-27886-1_14

National Research Council (2012). Education for Life and Work: Developing Transferable Knowledge and Skills in the 21st Century. National Academies Press

Niegemann, H. & Weinberger, A. (2020). Handbuch Bildungstechnologie: Konzeption und Einsatz digitaler Lernumgebungen. Wiesbaden: Springer.

Niegemann, H., Hessel, S., Hoschscheid-Mauel, D., Aslanski, K. & Deimann, M., Kreuzberger, G. (2013). Kompendium E-Learning. Berlin: Springer.

O’Carroll, A.M., Westby, E.P., Dooley, J. & Gordon, K.E. (2015). Information-Seeking Behaviors of Medical Students: A Cross-Sectional Web-Based Survey. JMIR Medical Education, 1(1), 2.

Oranje, A., Gorin, J., Jia, Y., & Kerr, D. (2017). Collecting, Analyzing, and Interpreting Response Time, Eye-Tracking, and Log Data. In K. Ercikan & J. W. Pellegrino (Eds.), NCME applications of educational measurement and assessment book series. Validation of score meaning for the next generation of assessments: The use of response processes (pp. 39-51). Routledge

Palinkas, L. A., Horwitz, S. M., Green, C. A., Wisdom, J. P., Duan, N. & Hoagwood, K. (2015). Purposeful Sampling for Qualitative Data Collection and Analysis in Mixed Method Implementation Research. Administration and policy in mental health, 42(5), 533-544. https://doi.org/10.1007/s10488-013-0528-y

Paul, R. & Elder, L. (2005). A Guide for Educators to Critical Thinking Competency Standards, Principles, Performance Indicators, and Outcomes with a Critical Thinking Master Rubric. Foundation for Critical Thinking.

Redecker, C. (2017). European Framework for the Digital Competence of Educators: DigCompEdu. Luxembourg: Publications Office of the European Union. https://doi.org/10.2760/159770

Reichert, F., Zhang, D., Law, N.W.Y., Wong, G.K.W., de la Torre, J. (2020). Exploring the structure of digital literacy competence assessed using authentic software applications. Educational Technology Research and Development, 68, 2991–3013.

Rott, K.J. (2014). Medienkompetenz im Studium: Wie gut ist die Vorbereitung für das spätere Berufsfeld? In O. Zawacki-Richter, D. Kergel, N. Kleinefeld, P. Muckel, J. Stöter & K. Brinkmann (Hrsg.), Teaching Trends 2014. Offen für neue Wege: Digitale Medien in der Hochschule Münster (S.153-169). New York: Waxmann.

Russell, L. B., & Huber, U. (2017). Some Thoughts on Gathering Response Process Validity Evidence: in the Context in Online Measurement and Digital Revolution. In B. D. Zumbo & A. M. Hubley (Eds.), Understanding and Investigating Response Processes in Validation Research (pp. 229-250). Springer International Publishing.

Sá, W. C., West, R. F., & Stanovich, K. E. (1999). The domain specificity and generality of belief bias: Searching for a generalizable critical thinking skill. Journal of Educational Psychology, 91(3), 497–510. https://doi.org/10.1037/0022-0663.91.3.497

Schiefner-Rohs, M. (2012). Verankerung von medienpädagogischer Kompetenz in der universitären Lehrerbildung. In R. Schulz-Zander, R.B. Eickelmann, H. Moser, H. Niesyto & P.Grell (Hrsg.), Jahrbuch Medienpädagogik 9.,(S.359-387). Wiesbaden: Springer.

Schimmel, R. (2011). Recherche im Jurastudium. Bessere Noten mit besseren Suchmaschinen-Strategien. https://www.lto.de/recht/studium-referendariat/s/recherche-im-jurastudium-bessere-noten-mit-besseren-suchmaschinen-strategien/

Schmidt, S., Zlatkin-Troitschanskaia, O., Roeper, J., Klose, V., Weber, M., Bültmann, A.-K., & Brückner, S. (2020). Undergraduate Students' Critical Online Reasoning: Process Mining Analysis. Frontiers in Psychology. Advance online publication. https://doi.org/10.3389/fpsyg.2020.576273

Senkbeil, M. (2018). Development and validation of the ICT motivation scale for young adolescents. Results of the international school assessment study ICILS 2013 in Germany. Learning and Individual Differences, 67, 167-176. https://doi.org/10.1016/j.lindif.2018.08.007

Shanahan, M.C. (2008). Transforming information search and evaluation practices of undergraduate students. International Journal of Medical Information, 77(8), 518-526.

Shavelson, R. J., Zlatkin-Troitschanskaia, O., Beck, K., Schmidt, S., & Marino, J. P. (2019). Assessment of University students’ critical thinking: next generation performance assessment. International Journal of Testing, 19(4), 337–362. https://doi.org/10.1080/15305058.2018.1543309

Soto, C. J., & John, O. P. (2017). The next Big Five Inventory (BFI-2): Developing and assessing a hierarchical model with 15 facets to enhance bandwidth, fidelity, and predictive power. Journal of Personality and Social Psychology, 113, 117–143. https://doi.org/10.1037/pspp0000096

Steffens, Y., Schmitt, I. L., & Aßmann, S. (2017). Mediennutzung Studierender: Über den Umgang mit Medien in hochschulischen Kontexten. Systematisches Review nationaler und internationaler Studien zur Mediennutzung Studierender. https://doi.org/10.13154/rub.106.95

Stoecker, D. (2013). eLearning-Konzept und Drehbuch: Handbuch für Medienautoren und Projektleiter. Springer-Verlag.

Taylor, A., & Dalal, H. A. (2014). Information literacy standards and the World Wide Web: results from a student survey on evaluation of Internet information sources. Information Research, 19(4).

Toplak, M. E., & Stanovich, K. E. (2002). The domain specificity and generality of disjunctive reasoning: Searching for a generalizable critical thinking skill. Journal of Educational Psychology, 94(1), 197–209. https://doi.org/10.1037/0022-0663.94.1.197

Tossell, C. C., Kortum, P., Shepard, C., Rahmati, A., & Zhong, L. (2015). You can lead a horse to water but you cannot make him learn: Smartphone use in higher education. British Journal of Educational Technology, 46(4), 713-724. https://doi.org/10.1111/bjet.12176

Urban, J., & Schweiger, W. (2013). News quality from the recipients’ perspective. Journalism Studies, 15, 821–840. doi: 10.1080/1461670X.2013.856670

Van der Kleij, F. M., Feskens, R. C. & Eggen T. J. (2015). Effects of Feedback in a Computer-based Learning Environment on Students Learning Outcomes: A Meta-Analysis. Review of Educational Research, 85(4), 475–511.

Vasilyeva, E., Puuronen, S., Pechenizkiy, M. & Räsänen, P. (2007). Feedback adaption in web-based learning systems. International journal of continuing Engineering Education and Life Long Learning, 17, 337–357.

Wagner, J. (2018). Legal Tech und Legal Robots. essentials. Springer Fachmedien Wiesbaden. https://doi.org/10.1007/978-3-658-20057-2

Walton, D. (2006a). Fundamentals of Critical Argumentation. Critical Reasoning and Argumentation. Cambridge University Press. https://doi.org/10.1017/CBO9780511807039

Walton, D. (2006b). Rules for Reasoning from Knowledge and Lack of Knowledge. Philosophia, 34, 355–376.

Watson, H. & Burr, S. (2018). Research skills in medical education. MedEdPublish, 7(3). https://doi.org/10.15694/mep.2018.0000151.1

Weber, H., Becker, D. & Hillmert, S. (2019). Information-seeking behaviour and academic success in higher education: Which search strategies matter for grade differences among university students and how does this relevance differ by field of study?. Higher Education, 77(4), 657–678. https://doi.org/10.1007/s10734-018-0296-4

Weber, H., Hillmert, S., & Rott, K. (2018). Can digital information literacy among undergraduates be improved? Evidence from an experimental study. Teaching in Higher Education, 23(8), 909-926. https://doi.org/10.1080/13562517.2018.1449740.

White, R., Dumais, S. & Teevan, J. (2009). Characterizing the influence of domain expertise on web search behavior. WSDM '09: Proceedings of the Second ACM International Conference on Web Search and Data Mining, 132-141. https://doi.org/10.1145/1498759.1498819

Wiley, J., Goldman, S. R., Graesser, A. C., Sanchez, C. A., Ash, I. K., & Hemmerich, J. A. (2009). Source Evaluation, Comprehension, and Learning in Internet Science Inquiry Tasks. American Educational Research Journal, 46(4), 1060-1106. https://doi.org/10.3102/0002831209333183

Wineburg, S. & McGrew, S. (2018). Lateral Reading and the Nature of Expertise: Reading Less and Learning More When Evaluating Digital Information. Stanford History Education Group Working Paper, 2017(A1). http://dx.doi.org/10.2139/ssrn.3048994

Wineburg, S., Breakstone, J., McGrew, S. & Ortega, T. (2018). Why google can't save us. The challenges of our post-gutenberg moment. In O. Zlatkin-Troitschanskaia, G. Wittum, & A. Dengel (Eds.), Positive Learning in the Age of Information (pp. 221–228). Springer. https://doi.org/10.1007/978-3-658-19567-0_13

Yu, R., Gadiraju, U., Holtz, P., Rokicki, M., Kemkes, P., & Dietze, S. (Eds.) (2018). Predicting User Knowledge Gain in Informational Search Sessions. http://arxiv.org/pdf/1805.00823v1

Zlatkin-Troitschanskaia, O., Hartig, J., Goldhammer, F., & Krstev, J. (2021, in press). Students’ Online Information Use and Learning Progress in Higher Education – A Critical Literature Review. Studies in Higher Education. Special Issue.

Zlatkin-Troitschanskaia, O., Beck, K., Fischer, J., Braunheim, D., Schmidt, S. & Shavelson, R. J. (2020). The role of students’ beliefs when critically reasoning from multiple contradictory sources of information in performance assessments. Frontiers in Psychology, 11, 2192. https://doi.org/10.3389/fpsyg.2020.02192

Zlatkin-Troitschanskaia, O., Shavelson, R. J., Schmidt, S., & Beck, K. (2019). On the complementarity of holistic and analytic approaches to performance assessment scoring. The British Journal of Educational Psychology, 89(3), 468–484. https://doi.org/10.1111/bjep.12286

Zlatkin-Troitschanskaia, O., Toepper, M., Molerov, D., Buske, R., Brückner, S., Pant, H. A., Hofmann, S., & Hansen-Schirra, S. (2018). Adapting and Validating the Collegiate Learning Assessment to Measure Generic Academic Skills of Students in Germany: Implications for International Assessment Studies in Higher Education. In O. Zlatkin-Troitschanskaia, M. Toepper, H. A. Pant, C. Lautenbach, C. Kuhn (Eds.), Assessment of Learning Outcomes in higher education – Cross-National Comparisons and Perspectives (pp. 245-266). Springer.

Zumbo, B. D., & Hubley, A. M. (Eds.). (2017). Understanding and Investigating Response Processes in Validation Research. Springer International Publishing.