Degree Type

Restricted to Claremont Colleges Dissertation

Degree Name

Education, PhD


School of Educational Studies

Dissertation or Thesis Committee Member

Gwen Garrison

Dissertation or Thesis Committee Member

June Hilton

Dissertation or Thesis Committee Member

David Drew

Terms of Use & License Information

Creative Commons Attribution-NoDerivatives 4.0 International License
This work is licensed under a Creative Commons Attribution-NoDerivatives 4.0 International License.

Rights Information

© 2024 Cleon M. McLean


Keywords

cognitive load theory, extraneous cognitive load, SBAC findings, SBAC recommendations, schema, working memory

Subject Categories

Educational Assessment, Evaluation, and Research


This non-consecutive, three-year qualitative study explored the testing experiences of high school juniors who used Chromebooks to take the Smarter Balanced Summative Assessments (SBAC) in English Language Arts and Mathematics. The study began in 2015, the first year SBAC was administered to students across California and other states, at four settings in one local unified high school district. It was then paused for two years to allow fuller implementation of SBAC. In 2017, the study resumed at the original research settings plus one more, with the goal of capturing significant qualitative insights into a new group of participants' SBAC experiences. The study was then paused a second time to allow for preliminary data analyses and reflection. Finally, in 2022, after COVID protocols had been adopted and practiced, the study resumed with a third group of participants at three of the previous five research settings. These punctuated data-collection periods afforded comparative analyses of students' computer-mediated test-taking experiences, phenomena not documented in the known literature on the initial year of SBAC.

The study's single research question, "What were high school juniors' experiences while taking the Smarter Balanced Summative Assessments?", spurred phenomenological explorations via focus-group and individual interviews with a total of 90 participants, conducted in person and via the Zoom platform. Subsequent analyses of the three years of data, using MAXQDA Analytics Pro 2024 software, resulted in four critical findings: (1) poor incentive structures and unclear reasons for why students needed to take SBAC contributed to participants' lackluster motivation to perform to the best of their abilities; (2) problematic SBAC content language contributed to students' frustration and extraneous cognitive load; (3) limited formal computer literacy skills stunted students' abilities to understand and manipulate SBAC's logics of semantic and visual discourse design; and (4) the time-sensitive affordances of some SBAC tools and design features restricted test-takers from keeping virtual records of their participation within and between sections of the same test. These findings signaled five limitations of this study and yielded five recommendations for practice, especially concerning optimal instructional designs for standardized-test preparatory content delivered via instructional technology, and the advantages of such designs for student motivation and performance.