Famed physicist Albert Einstein once said, “Not everything that can be counted counts, and not everything that counts can be counted.” In scientific research, numbers can tell us what is happening, but they often fall short of explaining why or how people experience it.
The STEMM Education and Outreach program at St. Jude focuses on fostering interest in science, technology, engineering, mathematics and medicine (STEMM) subjects and careers by helping students see themselves as scientists. Much of this work focuses on elements that defy quantification, such as feelings, perceptions, motivations and other human emotions and experiences.
Qualitative research explores the complexity of human experiences by bridging disciplines and blending scientific rigor with human insights.
“Qualitative research enables us to tell the human stories behind the numbers in a way that data alone cannot,” said Robyn Pennella, MPH, manager of the program evaluation team. The team is responsible for evaluating and assessing programs for the Cancer Research, Training, Education & Coordination (CRTEC) efforts at St. Jude, including those within the STEMM Education and Outreach Program. They develop comprehensive reports that offer valuable insights into program effectiveness and impact.
These reports help stakeholders understand the impact of their initiatives, make informed decisions and guide program planning. By fostering this cycle of continuous improvement, the team helps programs achieve their intended goals.
The team uses techniques that explore student experiences, personal perspectives and the social, cultural and emotional factors that influence behaviors and decisions.
They dive into the why and how behind the numbers.
Quantitative metrics, such as enrollment rates and self-efficacy surveys, are undeniably important when analyzing a program’s success. Such numbers offer measurable indicators of progress and help identify trends. But these data alone cannot capture the complex human stories that shape educational journeys, career choices and interest in STEMM fields.
To address this, the program evaluation team uses a mixed methods approach, combining qualitative and quantitative methods to gain a more comprehensive understanding of program outcomes. “The qualitative research helps enrich and substantiate the findings from quantitative data, allowing us to uncover more comprehensive insights and context,” explained Pennella.
The qualitative side of this approach collects detailed insights through methods such as one-on-one interviews and open-ended survey responses. The conversational nature of the interviews allows the interviewer to follow the participant’s lead, enabling deeper and more meaningful discussions. Surveys are used to measure program impact, often comparing participants’ perceptions before and after their experience to assess changes over time. These methods allow participants to share their personal experiences, challenges and growth in their own words, offering context that complements the numerical data.
Once completed, the interviews and surveys are analyzed using a critical and reflective approach to explore key areas such as program experience, science identity and other related factors. “Our research lets us go beyond traditional program evaluation metrics. It helps us understand how a program shapes a trainee’s development, whether it’s deepening a student’s understanding of science, influencing their career path or helping them grow into more compassionate scientists,” said Pennella.
On the quantitative side, the team uses metrics such as enrollment rates and pre- and post-program assessment scores to provide objective measures of success. By leveraging robust statistical methods, the program evaluation team ensures these measures are not only reliable but also meaningful.
By using assessments of student progress, the team can identify areas where a program’s curriculum is most effective while also uncovering gaps in student understanding of specific concepts. These insights enable the team to recommend targeted adjustments that further refine the curriculum.
“In a pilot study of the program, the Virtual STEMM Academy conducted a pre-test on topics related to the cell cycle and blood,” explained Eric Rivera-Peraza, MS, a data analyst on the program evaluation team. “The same test questions were asked after the three-week curriculum, and our team performed a paired t-test to compare pre-test and post-test scores, which provided a statistical assessment of student progress throughout the program.”
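To make that analysis concrete, here is a minimal sketch of what such a paired comparison could look like in Python with SciPy. The score values and variable names are hypothetical, invented purely for illustration; they are not the Virtual STEMM Academy’s actual data or analysis code.

```python
# Illustrative paired t-test on pre- and post-program assessment scores.
# The scores below are made-up example values, not real program data.
from scipy import stats

# Hypothetical percent-correct scores for the same students, paired by
# position: pre_scores[i] and post_scores[i] belong to the same student.
pre_scores = [55, 62, 48, 70, 66, 59, 73, 61]
post_scores = [68, 75, 60, 82, 71, 70, 85, 74]

# ttest_rel runs a paired (dependent-samples) t-test, appropriate here
# because each student contributes both a pre-test and a post-test score.
result = stats.ttest_rel(post_scores, pre_scores)

print(f"t-statistic: {result.statistic:.2f}")
print(f"p-value: {result.pvalue:.4f}")
```

A small p-value suggests the average change in scores is unlikely to be due to chance alone, which is the kind of statistical assessment of student progress Rivera-Peraza describes.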
By combining rigorous statistical analysis with a focus on actionable insights, the program evaluation team uses quantitative methods as a cornerstone of program evaluation, helping stakeholders make informed decisions and continuously refine their initiatives.
The impact of the research conducted by the program evaluation team extends beyond the STEMM Education and Outreach Program. Leveraging their expertise in evaluation methodologies, they have successfully applied their skills to assess and improve a variety of other programs across St. Jude.
“We apply these same methods for STEMM programs, but we’ve helped evaluate many programs across St. Jude. Most recently, we’ve expanded to support more CRTEC efforts in the Cancer Center,” said Pennella.
The program evaluation team has redefined how program success is measured. While traditional metrics such as numbers served or products produced remain important, qualitative research enables the exploration of deeper impacts. By moving beyond conventional measures, the team uncovers the transformative effects of these programs.