Innovations in STEM Education Evaluation
 
Vancouver News, April 13, 2012

Session title:

Innovations in STEM Education Evaluation: Enhancing the Quality of Evidence Available to Assess the Impacts of Research Portfolios

Experiences in Scaling Up

An early morning Friday session (29.014) on the experiences of four education researchers in taking innovations to scale drew an audience of 200. Each of the programs required the researcher to be directly involved in schools -- no ivory tower research here.
 
The four programs included one small program seeking to grow to serve more students, one that had already grown to serve 1,000 schools, one that had failed in its efforts to grow beyond the demonstration stage, and one that was in development as a model for giving schools enhanced capacity to change.
 
The researchers shared lessons from their efforts, the most important being clarity of purpose about the intended audience and about the central aspects of the intervention. Other scale-up suggestions: anticipate capacity issues in implementing the innovation, and secure and maintain system buy-in for it.
 
Robert Slavin, creator of Success for All, added some keys to the success of his program: stay non-profit, avoid dependence on grants for operations, and be attentive to local needs.

Summary:
Researchers presented insights from three recent National Science Foundation initiatives to advance innovations in science, technology, engineering, and mathematics (STEM) education evaluation and to enhance the quality of evidence available to assess the impacts of portfolios of research. These measures are intended to inform strategic planning across the educational research, planning, policy-making, and professional communities by providing accountability and indicators of the impact of funded research. The presenters considered challenges encountered in devising appropriate metrics and approaches for undertaking such evaluations, particularly when the portfolios incorporate diverse and innovative investigations at various stages of the research and development cycle.

Time: Friday, April 13, 2:15 p.m.–3:45 p.m. 

Building/Room: Marriott Pinnacle, Third Level - Pinnacle III

Session Participants:

Chair: Finbarr C. Sloane (Arizona State University)

Using Bibliometric Data for Program Evaluation: A Citation Analysis of the National Science Foundation’s Program on Research and Evaluation on Education in Science and Engineering
James S. Dietz (National Science Foundation), Kevin L. Brown (NORC at the University of Chicago)

 
Developing a Rubric for Assessing the Methodological Rigor of Research Portfolios: Insights From a Pilot Project 

Barbara L. Schneider (Michigan State University), Sarah-Kathryn McDonald (NORC at the University of Chicago)

Advancing Innovations in STEM Education Evaluation: Program, Portfolio, and Project Evaluation Initiatives in the National Science Foundation’s Education and Human Resources Directorate 
Janice M. Earle (National Science Foundation)

Discussant: Finbarr C. Sloane (Arizona State University)

Discussant: Stephen J. Hegedus (University of Massachusetts)

 
