Elena Forzani

Assistant Professor of Literacy Education, Boston University
Chapter Member: Boston SSN
About Elena

Forzani's research focuses on understanding and supporting students' digital literacy practices. In particular, she investigates the cognitive, metacognitive, and motivational dimensions of online reading and how to support students in effectively evaluating the credibility of online information. Her research also informs the development of more equity-oriented digital reading assessments. Forzani serves on the NAEP Reading Standing Committee and consults on a number of reading curriculum and assessment projects.

Publications

"Advances and Missed Opportunities in the Development of the 2026 NAEP Reading Framework" (with Peter Afflerbach, Sarah Aguirre, Nancy Brynelson, Gina Cervetti, Byeong-Young Cho, Julie Coiro, Georgia Earnest GarcĂ­a, John T. Guthrie, Kathleen Hinchman, Carol D. Lee, Mariana Pacheco, P. David Pearson, Alicia Ross, Allison Skerrett, and Paola Uccelli). Literacy Research: Theory, Method, and Practice 71, no. 1 (2022): 153-189.

Provides evidence that a small group of Board members aimed to preserve the status quo in reading assessment by downplaying reliance on expertise and authoritative sources of research on reading, learning, and assessment, and by removing attention to equity in NAEP Reading. Also discusses both successful (i.e., approved by the Board) and unsuccessful (i.e., rejected by the Board) recommendations for changes to the 2026 Framework that were initially proposed by the DP.

"What Does More and Less Effective Internet Evaluation Entail?: Investigating Readers’ Credibility Judgments Across Content, Source, and Context" (with Julie Corrigan and Carita Kiili). Computers in Human Behavior 135 (2022): 1-16.

Uses the CORE framework for critical online resource evaluation to examine how readers evaluate information during online inquiry.

"Reimagining Literacy Assessment through a New Literacies Lens" (with Julie Corrigan and David Slomp). Journal of Adolescent and Adult Literacy 64, no. 3 (2020): 351-355.

Provides perspectives, questions, and research that enable readers to better advocate for themselves and their students as they develop their own assessment programs and respond to assessment programs imposed on them.

"Characteristics and Validity of an Instrument for Assessing Motivations for Online Reading to Learn" (with Donald J. Leu, Eva Yujia Li, Christopher Rhoads, John T. Guthrie, and Betsy McCoach). Reading Research Quarterly 56, no. 4 (2020): 761-780.

Helps establish the MORQ as a well-validated instrument for measuring online reading motivation.

"A Three-Tiered Framework for Proactive Critical Evaluation During Online Inquiry" Journal of Adolescent and Adult Literacy 63, no. 4 (2020): 401-414.

Proposes a framework that positions readers as proactive judges who iteratively evaluate relevancy and credibility within and across three tiers (content, source, and context) and across multiple texts while researching a topic of interest online.

"How Well Can Students Evaluate Online Science Information? Contributions of Prior Knowledge, Gender, Socioeconomic Status, and Offline Reading Ability" Reading Research Quarterly 53 (2018): 385-390.

Finds that evaluation is a unique and difficult dimension of online research and comprehension. Also finds that girls outperform boys, and that students with greater prior knowledge and stronger offline reading ability evaluate online information better than students with less prior knowledge and weaker offline reading ability.