Accreditation, Assessment, and Program Evaluation Research in Educator Preparation SIG 174
2022 Annual Meeting
Sat, April 23, 11:30am to 1:00pm PDT (2:30 to 4:00pm EDT), Manchester Grand Hyatt, Floor: 2nd Level, Harbor Tower, Balboa A
When thinking about inequities in education, it is important to consider how assessments are designed and administered. The first paper describes a longitudinal case study of program completer impact on P-12 student learning. The second paper examines survey design, asking whether an even number of indicators in a Likert scale is preferable to an odd number. The third paper compares outcomes from eleven professional licensure exams proctored online with outcomes from in-person administration.
Chair: Barbara L. Bales, University of Wisconsin
Discussant: Mehmet “Dali” Ozturk
#1-Measuring Completer Impact in Non-Data-Sharing States: A Multi-Year Case Study Design and Findings
F. Todd Goodson, Kansas State University
Eileen Wertzberger, Kansas State University
Abstract: This session describes a longitudinal case study design employing focus group methodologies to assess completer impact in P-12 classrooms. Findings to date indicate a rich set of factors used both by new teachers and by employers to assess impact on student learning. These data arguably afford a more nuanced view of teacher impact than state assessment data, and they are uniquely capable of driving continuous improvement at the program level.
#2-An Examination of Instrument Design Requirement for Educator Preparation Programs
Jianjun Wang, California State University – Bakersfield
Abstract: The Council for the Accreditation of Educator Preparation (CAEP) requires data collection to track program quality across Educator Preparation Providers. In its assessment rubric, CAEP advocates an even number of choices to eliminate neutral responses. To examine potential issues in assessment outcomes, this study gathers survey data from a unit operation survey previously endorsed by a national accreditation team. Based on a theoretical framework of hypothesis testing that models the dichotomy of odd- vs. even-numbered choices, the results showed no evidence against the use of a Likert-type scale with a midpoint option for neutral responses.
#3-Comparing Outcomes From Examinations Proctored In Test Centres and Online Using Live Remote Proctoring Technology
Gemma Melissa Cherry, Dublin City University
Michael O'Leary, Dublin City University
Linda Waters, Prometric
Li-Ann Kuan, Prometric
Abstract: The onset of the COVID-19 pandemic brought with it the closure of many bricks-and-mortar testing centres and a concomitant move to the remote (online) proctoring of many credentialing exams that were scheduled to take place throughout 2020. In response to a dearth of research in the area, the study described in this paper uses data from eleven professional licensure examinations taken by 14,097 candidates across four US States to compare outcomes for tests proctored either in Test Centres or remotely in real-time using Live Remote Proctoring software. Candidate outcomes were compared using average percent correct and passing rates. Test psychometric properties were compared using measures of reliability, decision consistency, item difficulty, item discrimination and time taken to complete tests.