Evaluation

How are learning impacts evaluated in public libraries? What is currently known about the nature of STEM learning in public libraries? What critical factors lead to rich and effective STEM learning experiences in libraries? Review findings from leaders in evaluating public library programs, then join the ongoing discussion!

Presentations

The 2015 Public Libraries & STEM conference set out, in part, to develop the foundation for a future evaluation and research agenda that examines STEM learning in public libraries.

William R. Penuel (2015)
Evaluating STEM Programs in Public Institutions in Communities: Focusing on Equity 
University of Colorado

Download PDF

Broadening participation in science, technology, engineering, and mathematics (STEM) is a priority for educational organizations at the local, state, and national levels. Broadening participation means more than preparing future STEM professionals. It also means preparing people to draw upon STEM knowledge and practices for civic engagement and to address the needs of their communities. Evaluation of such programs must attend to the broad range of goals for STEM participation, document equity of opportunity to participate in STEM-related activities across settings, and support program improvement. In this presentation, I will begin by providing a framework for evaluating a diversity of outcomes of STEM programs in public institutions and for documenting equity in student opportunities to learn. I will illustrate how we are applying this framework to the study of STEM learning in different out-of-school settings that are focused on promoting interest-related learning. In the concluding part of my presentation, I will describe how evaluation can support continuous improvement of STEM programs in libraries, helping institutions to provide more equitable opportunities for STEM learning that lead to valued outcomes for participants and communities.

Buxner S. R., Jaksha A., LaConte K. M. (2013)
Bringing STEM to Libraries through the Explore Program: Findings from a Follow-Up Survey
Planetary Science Institute, New York Hall of Science, Lunar and Planetary Institute

*Presented at the 2015 Public Libraries & STEM Conference. View Conference Program

Download Poster

We report on a follow-up study of library staff who had attended in-person and/or online professional training to facilitate hands-on Earth and space science learning experiences through the Lunar and Planetary Institute’s Explore program. End-of-workshop evaluations consistently demonstrate that participants leave with increased content knowledge and intent to use what was modeled at the workshop. Our survey was completed by 183 individuals who had completed training from 1998–2013. The survey asked respondents about their institutions and responsibilities, their use of the materials (or their reasons for not using them), barriers to program implementation, partnerships they had created, and successes they had in implementing programs. Findings show that participation in the Explore program resulted in changes in participants’ beliefs and behaviors around leading science programming at their institutions. Sixty percent of respondents reported that they were very committed to providing science and engineering experiences for their visitors, compared to 16% before the training, and over 75% reported that they were more likely to advocate for including science and engineering in the programs offered at their facilities. Respondents reported gains in skills, knowledge, and confidence, along with a virtual network of support for bringing STEM programming to their communities, and described ways in which they integrated it into their existing programs.

Fitzhugh G., Coulon V. (2015)
Can Libraries Provide STEM Learning Experiences for Patrons? Findings from the STAR_Net Project Summative Evaluation
Education Development Center, Inc.

*Presented at the 2015 Public Libraries & STEM Conference. View Conference Program

Download Poster

We will highlight evaluation findings from the first phase of STAR_Net (the Science, Technology, Activities and Resources Library Education Network). STAR_Net was developed by the National Center for Interactive Learning and its partners with funding from the National Science Foundation. STAR_Net brought inquiry-based STEM learning experiences to 18 public libraries through two traveling exhibits, associated programming for library patrons, and a virtual community of practice for library staff and others interested in bringing STEM programming to libraries. Evaluation and Research Associates (now part of Education Development Center) evaluated the implementation of the project and its impact on library staff and patrons. The evaluation found that the project had a positive impact on participating librarians, library staff, and library patrons. Librarians and library staff reported that the project increased their knowledge, interest, and confidence in offering STEM programming in their libraries. The exhibits appeared to spark the interest of many patrons to learn more about science and engineering. Many libraries reached out and developed connections with organizations and individuals they had not worked with previously. The majority of libraries that hosted the exhibit reported that they planned to continue to offer STEM programming. The data suggest that the project may have a lasting impact on some libraries’ interest in and capacity to educate their patrons about science.

Teasdale R. M., Grack Nelson A. (2015)
Evaluation of Library STEM Programs: Learning from the BISE Project
University of Illinois at Urbana-Champaign, Science Museum of Minnesota

*Presented at the 2015 Public Libraries & STEM Conference. View Conference Program

Download Poster

Library staff members face increasing calls to demonstrate the impact of STEM services in public libraries, yet few librarians are familiar with the designs, methods, and measures that can be used to evaluate informal science education (ISE) projects. This poster first introduces a resource to assist librarians in developing their understanding of ISE evaluation and then shares preliminary findings of a study on the outcomes that ISE evaluators currently examine and the methods they use to study those outcomes. The resource to be shared comes from the Building Informal Science Education (BISE) project, which was funded by the National Science Foundation in 2010. The BISE project created a database of ISE evaluation reports and developed metadata codes to facilitate searching and analysis of those reports. Librarians can use the BISE database and coding scheme to learn how evaluation is conducted in ISE contexts beyond libraries, including museums, after-school programs, and broadcast media. The poster then reports preliminary findings of a study currently underway using the evaluation reports in the BISE database. This investigation examines the reports to understand (a) the types of outcomes that evaluators of ISE projects study and (b) the designs, methods, and measures that are used to examine those outcomes. Librarians can use these findings to inform their thinking as they develop approaches for evaluating the impact of STEM learning in public libraries.

Validated Evaluation Instruments

Katie Van Horne, Bill Penuel, Vera Michalchik
Measures of Connected Learning
DML Research Tools

Visit Web Page

Charles Smith, Executive Director
Youth Program Quality Assessment (PQA)®
David P. Weikart Center for Youth Program Quality

Visit Web Page

James Bell, Project Director
Research and Evaluation Instruments
Center for Advancement of Informal Science Education (CAISE), District of Columbia, United States

Visit Web Page

Noyce Foundation & PEAR
Assessment Tools in Informal Science
McLean Hospital and Harvard Medical School

Visit Web Page

The STELAR Center
STEM Learning and Research Instruments
Education Development Center, Waltham, MA

Visit Web Page

Cornell Lab of Ornithology
Developing, Validating, and Implementing Situated Evaluation Instruments (DEVISE)
Cornell University

Visit Web Page

BISE Project Team
Building Informal Science Education (BISE)
Visitor Studies Association (VSA)

Visit Web Page

Rena Dorph, Director
The Learning Activation Lab
Regents of the University of California

Visit Web Page

Edys Quellmalz, Project Director
Online Evaluation Resource Library (OERL)
Division of Research, Evaluation and Communication, Directorate for Education and Human Resources, National Science Foundation

Visit Web Page

Karen Pittman, Co-Founder, President and CEO
From Soft Skills to Hard Data: Measuring Youth Program Outcomes
The Forum for Youth Investment

Visit Web Page

Publications

Kate Haley Goldman (2011)
STAR_Net Front-end Evaluation Report 
National Center for Interactive Learning, Space Science Institute

Download PDF

Ginger Fitzhugh, Julie Elworth, Vicky Ragan Coulon (2013)
STAR_Net Summative Evaluation Report
Evaluation & Research Associates

Download PDF

Hussar K., Schwartz S., Boiselle E., Noam G. G. (2008)
Toward a Systematic Evidence-Base for Science in Out-of-School Time: The Role of Assessment
Noyce Foundation, Cambridge, MA

Download PDF


NCIL, in partnership with the Lunar and Planetary Institute, received funding from the National Science Foundation for the first-ever Public Libraries & STEM conference, which took place at the Sheraton Denver Downtown Hotel in Colorado, August 20–22, 2015. This invitation-only conference brought more than 150 library and STEM professionals and funders together to build productive relationships; explore promising practices in designing effective programs; help define a 21st-century vision of STEM learning in public libraries; and develop the foundation for a future evaluation and research agenda for libraries and their partners engaged in STEM education efforts. The conference background reports, presentation files, and results were used as the foundation of the resources compiled here. For more information, download the following documents.

Conference Summary | Conference Evaluation Report | Public Libraries and STEM

This material is based upon work supported by the National Science Foundation under Grant Numbers DRL-1413783 and DRL-1421427. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the National Science Foundation.