Date of Award
Summer 2024
Degree Type
Open Access Dissertation
Degree Name
Psychology, PhD
Program
School of Social Science, Policy, and Evaluation
Advisor/Supervisor/Committee Chair
Bianca Montrosse-Moorhead
Dissertation or Thesis Committee Member
Tarek Azzam
Dissertation or Thesis Committee Member
Stewart Donaldson
Dissertation or Thesis Committee Member
Jason Siegel
Terms of Use & License Information
This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.
Rights Information
© 2024 Natalie D. Jones
Keywords
data visualization, data viz, mixed methods, mixed methods research, program evaluation, research on evaluation
Subject Categories
Psychology
Abstract
The use of Mixed Methods (MM) in evaluation has become an increasingly valuable asset in evaluators' methodological toolbox. Despite prolific research on various aspects of mixed methods, from typologies to ontological debates, there remains a dearth of empirically derived evidence on how to effectively communicate MM findings. This gap is compounded by criticism regarding the integration of data strands in MM studies. A prominent means of addressing both the integration of data strands and the communication of findings in MM studies is the joint display. This study presents a multiphase exploration of how evaluators communicate MM findings and of the visual efficiency of joint displays compared to traditional separate displays.

Phase 1 involved reviewing a sample of professional evaluation studies (n = 30), accessed through the National Science Foundation (NSF) Informal Science Education (ISE) program database, to examine how MM findings are communicated. The review revealed limited use of joint displays beyond displays of quantitized qualitative data, with tables the most commonly used method for visualizing quantitative data and in-text quotes the most common for qualitative data.

Phase 1 results were used to develop and refine materials for the Phase 2 scenario-based study, which compared the visual efficiency of traditional and joint displays. Using a 2 (traditional vs. joint display) × 2 (positive vs. neutral valence) design, participants (n = 232) were randomly assigned to one of four conditions and viewed a description of a fictitious evaluation program along with an excerpt of the program report detailing two of its outcomes. Although the positively valenced joint display demonstrated higher visual efficiency, perceived program effectiveness, credibility, and attitudes toward the program than the other conditions, valence played the more significant role, influencing visual efficiency and the other outcome variables. Qualitative analysis provided insight into participants' recommendations and perceptions of the evaluator. Recommendations for future program funding were influenced by valence, with participants offering alternative reasoning for neutral findings. Participants reported relying equally on qualitative and quantitative data to assess program effectiveness.

Phase 3 further investigated differences in visual efficiency and credibility across various joint displays. Participants (n = 336) were randomly assigned to one of six conditions in a 3 (Design) × 2 (Valence) design comparing different styles of joint displays. Valence continued to significantly influence visual efficiency, with positively valenced displays showing greater efficiency, particularly the table and exemplar displays. Valence also affected outcome variables such as perceived credibility and aesthetics, with positively valenced conditions consistently yielding more favorable results. The type of display did not significantly affect participants' perceptions, and respondents again relied equally on qualitative and quantitative data to assess program effectiveness. However, participants in positively valenced conditions tended to recommend increased program funding and retaining the same evaluator, while those in neutral conditions were more likely to suggest decreased funding and a different evaluator. These findings underscore the need to apply data visualization principles to mixed methods research and the communication of evaluation findings.
By prioritizing audience understanding through tailored visualizations, this study aims to deepen insight into stakeholder perceptions and how to effectively communicate mixed methods evaluation findings. Further, the findings carry significant implications for evaluation theory and practice, specifically concerning the impact of the valence of evaluation findings (i.e., positive vs. neutral program results) on stakeholder perceptions and the necessity of addressing misunderstandings about evaluation methodologies. Emphasizing the human element in communication, the research underscores the ongoing importance of effective communication strategies in conveying evaluation findings and building trust in evaluation processes.
ISBN
9798383699775
Recommended Citation
Jones, Natalie D. (2024). Where Numbers Meet Narratives: An Examination of Joint Displays in Mixed Methods Evaluation. CGU Theses & Dissertations, 825. https://scholarship.claremont.edu/cgu_etd/825.