Professional Development Courses
 
Professional Development and Training Courses at the 2018 Annual Meeting
 

Courses Start April 12, 2018


The AERA Professional Development and Training Committee is pleased to offer a robust program of courses for the 2018 AERA Annual Meeting in New York. The program was crafted based on consideration of more than 50 submissions and a competitive peer review process. Courses are designed at various levels (e.g., basic, intermediate, advanced) to reach graduate students, early career scholars, and other researchers who seek to increase their knowledge and enhance research skills.



Course Dates: 

  • The extended courses are held on Thursday, April 12, or on Friday, April 13.
  • The mini-courses will occur Friday through Monday, April 13–16.

Registration: 

  • Potential participants can register for Professional Development and Training Courses by logging into and updating their 2018 AERA Annual Meeting registration.
  • You must be registered for the Annual Meeting in order to purchase tickets to attend courses.

Materials: 

  • Course participants should bring a laptop with any software suggested or specified in the course description.

Questions: 

  • Direct questions about the professional development and training courses to profdevel@aera.net.

Click a course number to learn more.

Extended Courses
  • PDC01: Analyzing Data From International Large-Scale Assessment Using R

  • PDC02: Creative Strategies for Mixing Qualitative and Quantitative Approaches in Mixed Methods Research

  • PDC03: Data from the Early Childhood Longitudinal Study, Kindergarten Class of 2010–11 (ECLS-K:2011) at NCES

  • PDC04: Introduction to Systematic Review and Meta-Analysis

  • PDC05: Multilevel Modeling With Large-Scale International Databases Using the HLM Software Program

  • PDC06: Autoethnography: The Necessity of Bringing the Self Into Public Education

  • PDC07: Understanding and Implementing Power Analysis for Treatment-Control Studies

  • PDC08: An Interactive and Case-Centered Primer on Evaluation Theories and Approaches

  • PDC09: Advanced Meta-Analysis

  • PDC10: Designing Surveys for Education Research

  • PDC11: Exploring a Data-Informed Approach to the Development of Students’ Social-Emotional Competencies

  • PDC12: Writing Persuasive Mixed Methods Research Proposals

  • PDC13: Analyzing NAEP Data Using R

  • PDC14: How to Get Published: Guidance From Emerging and Established Scholars

Mini Courses
  • PDC15: Explaining Growth Models and Their Use in Accountability Systems for States, Districts, and Education Researchers

  • PDC16: Using NAEP Data on the Web for Educational Policy Research

  • PDC17: Writing an Application for an IES Research Grant

  • PDC18: Advanced Analysis Using Adult International Large-Scale Assessment Databases: PIAAC

  • PDC19: Using Empowerment Evaluation with Evaluation Worksheets from Google to Build Evaluation Capacity and Learning Communities

  • PDC20: Improving the Quality of Research Measures: Lessons from the Standards for Educational and Psychological Testing

  • PDC21: What Would it Take to Change Your Inference? Quantifying the Discourse about Causal Inferences in the Social Sciences

  • PDC22: Advanced Analysis Using School-Based International Large-Scale Assessment Databases: TIMSS, PIRLS, and PISA

  • PDC23: Evaluating for Improvement: Assessing the Implementation and Effectiveness of Strategies for Change Within a Continuous Improvement Cycle

  • PDC24: Strategies for Sharing Your Research

  • PDC25: Systematic Review: Tool for a Scholarly Critique of Literature

  • PDC26: Data Sharing and Research Transparency at the Article Publishing Stage

  • PDC27: Designing Adequately Powered Cluster Randomized Trials to Detect Main Effects, Moderation, and Mediation

  • PDC28: Mixed Methods Data from the Stanford Civic Purpose Project

  • PDC29: Introduction to Infographics and Data Visualization

  • PDC30: Introduction to Quantile Regression for Education Research

  • PDC31: Student Privacy 101 for Researchers


Extended Courses

PDC01: Analyzing Data From International Large-Scale Assessment Using R

Instructors: Emmanuel Sikali, U.S. Department of Education; Paul Bailey, American Institutes for Research; Lauren Harrell, National Center for Education Statistics; Stephen Provasnik, National Center for Education Statistics; Ting Zhang, American Institutes for Research
Date: Thursday, April 12
Time: 9 a.m. – 5 p.m.
Location: Sheraton New York Times Square, Lower Level - Union Square
Fee: $135

This course will introduce participants to procedures for analyzing large-scale international assessment data, using public-use TIMSS data files as example data sets. Participants will begin by installing the EdSurvey R package and importing the data files into R. They will learn how to manipulate the data, including merging, subsetting, and recoding, and will follow a handout to estimate descriptive statistics for plausible values and variables of interest in an international studies data file. Participants will learn how to use the package to perform the statistical techniques used most often in international studies data analyses, including selecting an appropriate sample, estimating mean scale scores for groups of students, benchmark analysis, gap analysis, regression, and correlation. Designated time will be set aside for participants to practice the techniques in R with variables of their own interest. Participants will have the opportunity to ask questions during all sessions of the course, including lectures.
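The plausible-value analyses the course teaches (via EdSurvey in R) rest on Rubin's combining rules: estimate the statistic once per plausible value, then pool the results. A minimal, illustrative Python sketch of that pooling step, with invented numbers that are not course materials:

```python
import math

def combine_plausible_values(estimates, sampling_vars):
    """Pool per-plausible-value results with Rubin's combining rules.

    estimates     -- point estimate (e.g., a group mean) from each plausible value
    sampling_vars -- sampling variance of each of those estimates
    """
    m = len(estimates)
    pooled = sum(estimates) / m                      # final point estimate
    within = sum(sampling_vars) / m                  # average sampling variance
    between = sum((e - pooled) ** 2 for e in estimates) / (m - 1)  # imputation variance
    total_var = within + (1 + 1 / m) * between       # total error variance
    return pooled, math.sqrt(total_var)

# Five hypothetical plausible-value means with their sampling variances
est, se = combine_plausible_values(
    [500.1, 498.7, 501.3, 499.5, 500.4],
    [4.0, 4.2, 3.9, 4.1, 4.0],
)
```

The (1 + 1/m) factor inflates the between-imputation variance to account for using a finite number of plausible values.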

 


PDC02: Creative Strategies for Mixing Qualitative and Quantitative Approaches in Mixed Methods Research

Instructors: Elizabeth G. Creamer, Virginia Polytechnic Institute and State University; Cherie D. Edwards, Virginia Polytechnic Institute and State University
Date: Thursday, April 12
Time: 9 a.m. – 5 p.m. 
Location: Sheraton New York Times Square, Lower Level – Murray Hill
Fee: $135

This interactive, problem-centered course introduces creative ways to mix qualitative and quantitative approaches in a mixed methods research project. This course is appropriate for graduate students and early career researchers with little or no prior knowledge of mixed methods research practice. The principal purpose of the course is to use a problem-oriented activity to generate creative ideas about ways to integrate qualitative and quantitative data during analysis. Course objectives are to (a) review the distinctions between multi-method, mixed methods, and fully integrated mixed methods research; (b) distinguish major reasons for using a mixed methods approach; (c) present dialectical pluralism as the paradigmatic grounding for mixed methods research; and (d) describe strategies for mixing at all phases of the research process, including the design, sampling, and analysis stages. Participants completing the course will come away with ideas about ways to design a mixed methods research study. A pre-course assignment is to read “A Primer About Mixed Methods Approaches for Research in an Educational Context,” available at https://mixedmethodresearch.wordpress.com/additional-resources-by-e-g-creamer/.



PDC03: Data from the Early Childhood Longitudinal Study, Kindergarten Class of 2010–11 (ECLS-K:2011) at NCES

Instructors: Shannon Russell, American Institutes for Research; Gail M. Mulligan, National Center for Education Statistics, U.S. Department of Education; Jill McCarroll, U.S. Department of Education; Cameron McPhee, American Institutes for Research
Date: Thursday, April 12
Time: 9 a.m. – 5 p.m. 
Location: Sheraton New York Times Square, Lower Level - Flatiron
Fee: $135

Researchers can use information from the Early Childhood Longitudinal Study, Kindergarten Class of 2010–11 (ECLS-K:2011) data set to examine the relationships between a wide range of family, school, classroom, and individual characteristics and children’s development, learning, and school performance in kindergarten and across the elementary school years. In this course, participants will become familiar with the design, content, and research utility of the ECLS-K:2011; learn how to use the Electronic Codebook (ECB) software to create individualized data sets for personal research; understand the need for using sample weights and adjusting variance estimates to conduct accurate analyses; and become familiar with resources available from NCES related to the ECLS-K:2011. The training will focus on the publicly released data from the kindergarten through fourth-grade data collections. This course is primarily lecture based, and there will be time for participants to practice navigating the ECB software, work through hands-on exercises, and ask questions about how the ECLS data can best be used to address their own research interests.


PDC04: Introduction to Systematic Review and Meta-Analysis 

Instructors: Amy L. Dent, University of California, Irvine; Terri D. Pigott, Loyola University Chicago; Joshua R. Polanin, Development Services Group; Joseph Taylor, BSCS
Date: Thursday, April 12
Time: 9 a.m. – 5 p.m. 
Location: Sheraton New York Times Square, Lower Level - Bowery
Fee: $135
This course is also available via live webcast and from the AERA Virtual Research Learning Center (VRLC).

This course will introduce the basics of systematic review and meta-analysis. Topics covered include developing a research question, searching the literature, evaluating and coding studies, conducting a meta-analysis, and interpreting results for various stakeholders. Participants are encouraged to bring an idea for a systematic review to the course, with time reserved for discussion about it with course instructors. Course activities will include lecture, hands-on exercises, small-group discussion, and individual consultation. The target audience includes both those new to systematic review and meta-analysis and those currently conducting either type of project. Knowledge of basic descriptive statistics is assumed.
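As background for the pooling step covered in the course, the fixed-effect (inverse-variance) meta-analytic average can be sketched in a few lines; the effect sizes below are invented for illustration, and the course itself is software-agnostic:

```python
def fixed_effect_meta(effects, variances):
    """Inverse-variance (fixed-effect) pooled effect size and its standard error."""
    weights = [1.0 / v for v in variances]            # precision weights
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se = (1.0 / sum(weights)) ** 0.5                  # SE of the pooled effect
    return pooled, se

# Three hypothetical standardized mean differences and their variances
pooled, se = fixed_effect_meta([0.30, 0.45, 0.25], [0.02, 0.05, 0.04])
```

Each study is weighted by the inverse of its variance, so more precise studies pull the pooled estimate toward their effects.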


PDC05: Multilevel Modeling With Large-Scale International Databases Using the HLM Software Program

Instructors: David C. Miller, American Institutes for Research; Francis Howard Lim Huang, University of Missouri; Sakiko Ikoma, American Institutes for Research; Sabine Meinck, IEA Hamburg; Bitnara Jasmine Park, American Institutes for Research; Yuan Zhang, American Institutes for Research
Date: Thursday, April 12
Time: 9 a.m. – 5 p.m. 
Location: Sheraton New York Times Square, Lower Level - Chelsea
Fee: $135

Data from international large-scale assessments (ILSAs) reflect the nested structure of education systems and are, therefore, very well suited for multilevel modeling (MLM). However, because these data come from complex cluster samples, there are methodological aspects that a researcher needs to understand when doing MLM, e.g., the need for using sampling weights and multiple achievement values for parameter estimation. This course will teach participants how to do MLM with data from ILSAs, such as PIRLS, TIMSS, and PISA. The content of the course will include an overview of the ILSAs and a presentation on the design of these studies and databases and implications for MLM analysis. Participants will learn how to specify two-level models using the HLM 7 software program and also learn about model comparison, centering decisions and their consequences, and available resources for doing three-level models. Time will be allotted for participants to work on practice exercises, with several instructors available to mentor and answer questions. Participants should have a solid understanding of OLS regression and a basic understanding of multilevel models. Prior experience using a statistical software program, such as Stata or SPSS, is helpful. Prior knowledge about ILSAs or prior experience using the respective databases or HLM software is not required. To fully participate in the hands-on demonstrations and example analyses, participants should bring their own laptops with HLM software (a free student version is available).


PDC06: Autoethnography: The Necessity of Bringing the Self Into Public Education

Instructors: Julie L. Pennington, University of Nevada, Reno; Sherick A. Hughes, University of North Carolina, Chapel Hill
Date: Thursday, April 12
Time: 9 a.m. – 5 p.m. 
Location: Sheraton New York Times Square, Lower Level - Gramercy
Fee: $135

The purpose of this course is to define autoethnographic research methods for qualitative researchers seeking a methodological alternative to objectivist orientations to research in public education. Autoethnography requires researchers to place themselves as the object of their research, and it allows educational researchers to unravel often-concealed perspectives in order to move researchers and educators to action on behalf of educational equity issues and concerns. Participants should have an interest in obtaining, or currently be in, a position that involves engaging autoethnography in their teaching and research. Target course participants include graduate students and current researchers interested in autoethnographic methodology. Information related to understanding the history of autoethnographic work, as well as designing, completing, and representing autoethnographic research in ways that push for justice and equity in public education, will be provided. Participants will leave the course with information to begin the process of adding autoethnography as a tool in their methodological toolkits, with a specific focus on establishing a community of critical colleagues in the course.


PDC07: Understanding and Implementing Power Analysis for Treatment-Control Studies

Instructors: Eric Hedberg, University of Chicago
Date: Thursday, April 12
Time: 9 a.m. – 5 p.m. 
Location: Sheraton New York Times Square, Lower Level - Sugar Hill
Fee: $135

Understanding statistical power is important for researchers, funders, and the education science field at large. This course will introduce the major concepts in power analysis (e.g., types of errors) and how they relate to statistical analyses. The course lectures will cover the treatment-control design for simple random samples (SRSs), SRSs with covariates, multilevel models (MLMs), and MLMs with covariates. Activities will include group exercises and calculating power, sample sizes, and effect sizes using interactive computer programs provided by the instructor. Discussion will include how to research assumptions and how to write about power. Free software (R and RStudio) for performing power analysis will be introduced with interactive exercises. This course is appropriate for researchers involved in the planning and execution of education studies: professors at all levels, postdocs, graduate students, and staff at research firms.
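The core computation behind the simplest case the course covers, a two-group SRS design, can be sketched with a normal approximation. A hedged Python illustration (the instructor's materials use R; the function name and numbers here are illustrative only):

```python
from statistics import NormalDist

def two_group_power(d, n_per_group, alpha=0.05):
    """Approximate power of a two-sided two-sample z test for a
    standardized mean difference d, with n_per_group units per arm
    (normal approximation to the t test)."""
    z = NormalDist()
    se = (2.0 / n_per_group) ** 0.5          # SE of the standardized difference
    z_crit = z.inv_cdf(1.0 - alpha / 2.0)    # two-sided critical value
    return 1.0 - z.cdf(z_crit - d / se)      # P(reject) under the alternative

# Classic benchmark: d = 0.5 with 64 per group gives roughly 80% power
power = two_group_power(d=0.5, n_per_group=64)
```

Inverting the same relationship (solving for n at a target power) is how required sample sizes are derived.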


PDC08: An Interactive and Case-Centered Primer on Evaluation Theories and Approaches

Instructors: Bianca Montrosse-Moorhead, University of Connecticut; Daniela C. Schroeter, Western Michigan University; Lyssa N. Wilson, Western Michigan University
Date: Friday, April 13
Time: 8 a.m. – 4 p.m. 
Location: Sheraton New York Times Square, Lower Level - Chelsea
Fee: $135

This interactive, case-centered course covers historical and contemporary theories and approaches to evaluation in diverse educational contexts. This one-day course is appropriate for graduate students, early career researchers, and evaluators with little or no prior knowledge of evaluation theories and approaches. Senior researchers and evaluators who wish to expand their knowledge and use of contemporary theories and approaches may also benefit. Participants will use real-life case studies to learn about historical and contemporary theories and approaches to designing and implementing evaluation, and will practice putting this knowledge to use. Brief lectures, group activities, and audience engagement will encourage participants to (a) recognize different methods-, use-, values-, and social justice–oriented evaluation theories and approaches; (b) identify strengths, limits, and opportunities associated with the various evaluation theories and approaches in differing educational contexts; and (c) apply different theories and approaches in evaluation practice. Participants completing the course will gain insight into how their own backgrounds, training, and contexts may influence their choice of or preference for particular approaches. Participants will be asked to read a case prior to the course.


PDC09: Advanced Meta-Analysis

Instructors: Terri D. Pigott, Loyola University Chicago; Joshua R. Polanin, Development Services Group; Ariel M. Aloe, University of Iowa; Ryan Williams, American Institutes for Research
Date: Friday, April 13
Time: 8 a.m. – 4 p.m. 
Location: Sheraton New York Times Square, Lower Level - Flatiron
Fee: $135

This course will introduce advanced methods in meta-analysis. Topics covered include the computation of effect sizes from complex research designs, the use of fixed versus random effects models, planning for the conduct of a meta-analysis, methods for exploring heterogeneity of effect sizes including categorical models and meta-regression, and graphical methods for the display of meta-analytic results and for testing the sensitivity of results to publication bias. The statistical package R will be used to conduct the statistical techniques discussed. Participants are encouraged to bring their own research in progress to the course. The activities will include lecture, hands-on exercises, and individual consultation. This course is designed to follow the introduction to systematic review and meta-analysis course given by the instructors in prior AERA professional development training sessions. The target audience is researchers who have systematic review and meta-analysis experience but need more in-depth knowledge of how to conduct and interpret models of effect size. Knowledge of basic descriptive statistics is assumed. Participants are required to bring a laptop computer.
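One topic named above, heterogeneity of effect sizes, is commonly quantified with Cochran's Q and the DerSimonian-Laird between-study variance. A small Python sketch with invented effects (the course itself works in R):

```python
def dersimonian_laird(effects, variances):
    """Cochran's Q and the DerSimonian-Laird estimate of the
    between-study variance (tau^2) for a random-effects model."""
    w = [1.0 / v for v in variances]                       # fixed-effect weights
    pooled = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    q = sum(wi * (e - pooled) ** 2 for wi, e in zip(w, effects))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)          # truncate at zero
    return q, tau2

# Four hypothetical study effects and their variances
q, tau2 = dersimonian_laird([0.10, 0.55, 0.30, 0.70], [0.03, 0.04, 0.05, 0.03])
```

When Q exceeds its degrees of freedom, tau^2 is positive and random-effects weights 1/(v_i + tau^2) spread weight more evenly across studies than the fixed-effect weights do.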


PDC10: Designing Surveys for Education Research

Instructors: Jill Walston, American Institutes for Research; Jeremy Redford, American Institutes for Research
Date: Friday, April 13
Time: 8 a.m. – 4 p.m. 
Location: Sheraton New York Times Square, Lower Level - Gramercy
Fee: $135

Surveys are a common tool for collecting data in education research. Surveys of students, parents, teachers, school administrators, district leaders, and others provide much of the data used to assess needs, investigate relationships, and evaluate programs and interventions. Survey data contribute to education theory development, support policy decisions, and help inform problems of practice in all areas of education research. The field of survey research offers evidence-based methods for designing and administering effective surveys. This course draws from well-known texts and literature from the field of survey methodology and offers an overview of survey development techniques, best practices, and resources that will be useful to education agency staff. The course is designed for State Education Agency and Local Education Agency staff but is also appropriate for other practitioners and researchers who want to advance their understanding of and ability to create high-quality surveys. The course covers the basics of planning for a survey project, exploring existing item sources, writing high-quality survey items, and pretesting items, as well as sampling considerations, data collection methods, and strategies for maximizing response rates. The course includes engaging presentations, small-group activities to practice skills, a binder with useful resource materials, and time for discussion and interaction with the presenters.


PDC11: Exploring a Data-Informed Approach to the Development of Students’ Social-Emotional Competencies

Instructors: Rolf K. Blank, NORC at the University of Chicago; Katie H. Buckley, Transforming Education; Bob LaRocca, Transforming Education
Date: Friday, April 13
Time: 8 a.m. – 4 p.m. 
Location: Sheraton New York Times Square, Lower Level - Bowery
Fee: $135
This course is also available via live webcast and from the AERA Virtual Research Learning Center (VRLC).

The purpose of this course is to provide researchers, educators, and administrators with best practices in assessing and supporting student social-emotional learning (SEL) skills. Participants will have the opportunity to explore measures designed for different uses, and SEL-related strategies for improving student outcomes. The course will begin with a discussion of the research and rationale behind measuring SEL and an examination of current assessments and their differing purposes. The second half of the course will feature a discussion on how to use resulting data from SEL measures (at the classroom, school, and system levels) and will model concrete, hands-on strategies and tools to strengthen student skills.


PDC12: Writing Persuasive Mixed Methods Research Proposals

Instructors: Cheryl-Anne Nadine Poth, University of Alberta; Elizabeth G. Creamer, Virginia Polytechnic Institute and State University
Date: Friday, April 13
Time: 8 a.m. – 4 p.m. 
Location: Sheraton New York Times Square, Lower Level - Sugar Hill
Fee: $135

Are you a scholar, student, or practitioner with a mixed methods study idea but not sure how to write a persuasive proposal? If so, then this course is for you! Through integrating a practical and learner-responsive approach, this course will help maximize the persuasiveness of your proposals by developing your competencies in conceptualizing, designing, and communicating mixed methods research proposals. The course is organized around three key questions that also point to the learning objectives: (a) What distinguishes persuasive mixed methods research proposals from other types of research proposals in terms of quality criteria and feasibility considerations? (b) What writing features should be incorporated to persuade target audiences of mixed methods research proposals? and (c) How can researchers avoid common pitfalls seen in mixed methods research proposals? Participants are encouraged to bring a study idea or preliminary draft of a proposal that can be further developed during the course, as opportunities for small-group work and feedback from participants and instructors will be embedded.


PDC13: Analyzing NAEP Data Using R

Instructors: Emmanuel Sikali, U.S. Department of Education; Paul Bailey, American Institutes for Research; Lauren Harrell, National Center for Education Statistics; Ting Zhang, American Institutes for Research
Date: Friday, April 13
Time: 8 a.m. – 4 p.m. 
Location: Sheraton New York Times Square, Lower Level - Murray Hill
Fee: $135

This course will introduce the unique design features of National Assessment of Educational Progress (NAEP) data and provide guidance on the data analysis strategies those features require, including the selection and use of appropriate plausible values, sampling weights, and variance estimation procedures (i.e., jackknife approaches). The course will provide participants with hands-on training in analyzing public-use NAEP data files using the R package EdSurvey, which was developed for analyzing national and international large-scale assessment data with complex psychometric and sampling designs. The knowledge and analytic approach presented in this course can be applied to analyzing other large-scale data with plausible values. This course is designed for individuals in government, universities, the private sector, and nonprofit organizations who are interested in learning how to analyze large-scale assessment data, especially NAEP and NAEP-like data. Participants should have at least basic knowledge of R (e.g., have taken an entry-level training on R programming) as well as statistical techniques including statistical inference and multiple regression. Working knowledge of item response theory and sampling theory is preferred. Participants need to bring a laptop preloaded with the latest version of R to participate in the hands-on portion.
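The jackknife variance estimation mentioned above works by re-estimating the statistic under each set of replicate weights and combining the deviations. A hedged Python sketch of the combining step for a paired-jackknife design like NAEP's (the numbers are invented, and operational NAEP uses many more replicates than shown here):

```python
def jackknife_variance(full_estimate, replicate_estimates):
    """Replicate-based jackknife sampling variance: the sum of squared
    deviations of each replicate estimate from the full-sample estimate,
    as used with paired-jackknife replicate weights (other designs
    multiply by a scale factor)."""
    return sum((r - full_estimate) ** 2 for r in replicate_estimates)

# Full-sample mean scale score and a handful of illustrative replicate means
var = jackknife_variance(250.0, [249.1, 250.6, 250.3, 249.8, 250.2])
se = var ** 0.5
```

Each replicate estimate comes from rerunning the same analysis with one replicate weight in place of the full-sample weight, which is why packages like EdSurvey manage the replicate weights automatically.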


PDC14: How to Get Published: Guidance From Emerging and Established Scholars

Instructors: Patricia A. Alexander, University of Maryland; Lauren M. Singer-Trakhman, University of Maryland; Sophie Jablansky, University of Maryland; Yuting Sun, University of Maryland; DeLeon Gray, North Carolina State University; Matthew McCrudden, University of Wellington; Panayiota Kendeou, University of Minnesota; Jeffrey R. Harring, University of Maryland; Diane Schallert, University of Texas, Austin; Sofie Loyens, University of North Carolina, Chapel Hill; Jeffrey A. Greene, University of North Carolina, Chapel Hill
Date: Friday, April 13
Time: 8 a.m. – 4 p.m. 
Location: Sheraton New York Times Square, Lower Level - Union Square
Fee: $135

This course will provide graduate students and early career faculty with critical information about how to publish their research in refereed journals and other scholarly venues. It offers an overview of the publishing process, from developing a research program to working with mentors, organizing a manuscript, and ultimately publishing research findings. Emerging scholars who have unlocked the keys to success and senior scholars who have editorial expertise and have mentored others new to the publication arena will share key information. Participants will engage in question-and-answer sessions with both the emerging scholars and the more seasoned veterans of academic publishing.


Mini Courses

PDC15: Explaining Growth Models and Their Use in Accountability Systems for States, Districts, and Education Researchers

Instructors: Elias Walsh, Mathematica Policy Research; Mariesa Herrmann, Mathematica Policy Research
Date: Saturday, April 14
Time: 8 a.m. – 12 p.m. 
Location: Sheraton New York Times Square, Lower Level - Chelsea
Fee: $65

This course will include lecture and group discussion aimed at explaining technically complex topics related to growth models and accountability systems. The goal of the course is to help state and district decision makers as well as education researchers who partner with states and districts to improve education evaluation systems. The course will translate complex ideas related to evaluation measures, point out where choices have important consequences for incentives, and frame key decisions in the context of state and district goals. The course is appropriate for state and district policy makers and education researchers who work with states and districts on issues related to measuring teacher and school effectiveness and designing evaluation and accountability systems.


PDC16: Using NAEP Data on the Web for Educational Policy Research

Instructors: Debra Kline, Educational Testing Service; Edward M. Kulick, Educational Testing Service; Emmanuel Sikali, U.S. Department of Education
Date: Saturday, April 14
Time: 8 a.m. – 12 p.m. 
Location: Sheraton New York Times Square, Lower Level - Sugar Hill
Fee: $105
This course is also available via live webcast and from the AERA Virtual Research Learning Center (VRLC).

This course is for researchers interested in using the NAEP Data Explorer Web tool. Participants will be guided through an examination of the NAEP data, focusing on (1) student, teacher, and school variables; (2) relationships between student performance and characteristics of students, their teachers, and their schools; and (3) using NAEP data to supplement other educational research. The course provides hands-on learning and active participation. Participants will work independently and share their findings with the group. The course will highlight the newly released 2017 NAEP math and reading data. 


PDC17: Writing an Application for an IES Research Grant

Instructors: Allen Ruby, Institute of Education Sciences; Meredith J. Larson, National Center for Education Research
Date: Saturday, April 14
Time: 8 a.m. – 12 p.m. 
Location: Sheraton New York Times Square, Lower Level - Union Square
Fee: No fee (by application only)

This course will provide instruction on writing a successful grant application to the Institute of Education Sciences (IES) Education Research Grants Program (84.305A) and Special Education Research Grants Program (84.324A). The course will focus on (1) the research topics, (2) the research goal structure, and (3) the four sections of the Research Narrative (Significance, Research Plan, Personnel, and Resources) that comprise the most important part of the IES grant application. If there is demand for information on the IES grant programs that support partnerships between researchers and practitioners (84.305H, 84.305L, and 84.324L), the course will also address these programs, including the fifth section of the Research Narrative (Partnership). Course leaders will introduce specific concepts and examples of strategies for writing key sections of the Research Narrative (e.g., introduction to the proposal, intervention description, theory of change, research design, analysis, and personnel), as well as examples of common errors and how to avoid them. Participants will be asked to submit initial drafts of some of these key sections and will work in small groups to revise them based on course leader and participant feedback. This course is by application only.


PDC18: Advanced Analysis Using Adult International Large-Scale Assessment Databases: PIAAC

Instructors: Eugenio Gonzalez, Educational Testing Service
Date: Saturday, April 14
Time: 1 p.m. – 5 p.m. 
Location: Sheraton New York Times Square, Lower Level - Murray Hill
Fee: No Fee. Funding for this PD Course is provided through grant R305B160020 to Educational Testing Service (ETS) from the Institute of Education Sciences (IES), U.S. Department of Education.

This course covers how to download and prepare the public-use data files from the Programme for the International Assessment of Adult Competencies (PIAAC) and how to conduct basic and advanced statistical analyses. PIAAC assessed literacy, numeracy, and problem-solving skills in the adult population of over 25 countries. Through a combination of lectures and hands-on exercises, the course will cover the statistical complexities and techniques used in these studies and their implications for analysis. Participants will learn to prepare the data files for advanced statistical analysis and to conduct basic and some advanced analyses using customized software provided during the course. Information on how to access the web-based tools will also be presented. Participants need to bring a laptop computer with Windows and SPSS or SAS installed, and must have knowledge of basic and intermediate statistics. Data files and demonstration software will be distributed during the course.


PDC19: Using Empowerment Evaluation with Evaluation Worksheets from Google to Build Evaluation Capacity and Learning Communities

Instructors: David M. Fetterman, Fetterman and Associates; Jason Ravitz, Google
Date: Saturday, April 14
Time: 1 p.m. – 5 p.m. 
Location: Sheraton New York Times Square, Lower Level - Chelsea
Fee: $105

Employing lecture, activities, demonstration, and discussion, this course introduces participants to empowerment evaluation and a more detailed evaluation planning process developed at Google. It presents how these approaches work together to enhance each other, and introduces technological tools to facilitate the process, utilizing methods from courses already taught at Google and Pacifica Graduate Institute. Empowerment evaluation builds capacity and fosters program improvement by teaching people how to evaluate their own programs. Key concepts include having a critical friend, cycles of reflection and action, and a community of learners. These can enhance efforts to produce measurable outcomes and demonstrate accountability. Basic steps include: 1) establishing a mission; 2) taking stock – rating performance; and 3) planning for the future – establishing goals and strategies to accomplish objectives. A dashboard is used to compare annual goals with quarterly progress. The role of the evaluator is one of a facilitator, while the team is in charge of the evaluation. Course participants learn to combine the strengths of empowerment evaluation with more detailed worksheet planning. There are no prerequisites. The course is designed for all levels, with optional pre-assignments, course exercises, and post-assignments.


PDC20: Improving the Quality of Research Measures: Lessons from the Standards for Educational and Psychological Testing

Instructors: Joan Herman, UCLA/CRESST; Linda L. Cook, Educational Testing Service; Frank Worrell, UC Berkeley
Date: Saturday, April 14
Time: 1 p.m. – 5 p.m. 
Location: Sheraton New York Times Square, Lower Level - Union Square
Fee: $65

This course provides knowledge to document and improve the quality of research measures. Based on the latest edition of Standards for Educational and Psychological Testing (AERA/APA/NCME, 2014), course content will highlight the importance of measurement quality in research and will particularly highlight standards for validity and fairness and their implications for measure development and/or selection. Course content will draw on examples from the research literature to help course participants understand core principles. Hands-on activities will be provided to help participants apply the Standards to improve the quality of their measures. At the end of the course, participants will be familiar with the content of the Standards and understand and be able to apply those Standards particularly relevant to research measures. Participants will be asked to critique examples of research studies using the Standards and engage in simulated selection/development and validation activities. Prerequisite skills required by the course are a general familiarity with the ideas and vocabulary associated with simple measurement concepts. The target audience for the course is advanced graduate students, early career scholars, and senior researchers and practitioners who are interested in strengthening their understanding of how the Standards can be used to improve their research.


PDC21: What Would it Take to Change Your Inference? Quantifying the Discourse about Causal Inferences in the Social Sciences

Instructors: Kenneth Frank, Michigan State University
Date: Saturday, April 14
Time: 1 p.m. – 5 p.m. 
Location: Sheraton New York Times Square, Lower Level - Sugar Hill
Fee: $105
This course is also available via live webcast and from the AERA Virtual Research Learning Center (VRLC).

Participants will learn the fundamentals of inferences from quantitative analysis in the social sciences, including Rubin’s causal model (the counterfactual) and statistical control through regression. Using these fundamentals, they will learn to characterize the robustness of statistical inferences and violations of assumptions from quantitative analyses. They will learn to characterize the robustness of inferences from a regression in terms of correlations associated with an omitted variable (assumed to be zero in making inferences from a regression). Participants will learn not only how to conceptualize the robustness of an inference but also how to calculate the sensitivity of inferences from general linear models using spreadsheets (Excel) or macros in SPSS, SAS, or STATA. Participants will learn how to apply the techniques to concerns about internal and external validity, as well as extensions of the techniques to logistic regression and multilevel models. Participants will learn to employ a language for articulating the robustness of inferences that can be applied to their own analyses or to inferences in the literature.
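As a concrete flavor of one such calculation, the impact threshold for a confounding variable (ITCV; Frank, 2000) expresses how strongly an omitted variable would have to correlate with both predictor and outcome to overturn an inference. The minimal Python sketch below is an illustration only; the course itself works with spreadsheets and SPSS/SAS/STATA macros, and the 1.96 critical value is a large-sample approximation.

```python
import math

def itcv(t_obs, df, t_crit=1.96):
    """Impact threshold of a confounding variable, sketched.

    t_obs:  observed t-statistic for the focal regression coefficient
    df:     its degrees of freedom
    t_crit: critical value for significance (1.96 approximates alpha = .05)
    Returns the product r(x,cv) * r(y,cv) an omitted confound would need
    to make the estimate just non-significant.
    """
    r_obs = t_obs / math.sqrt(t_obs ** 2 + df)      # correlation implied by t
    r_crit = t_crit / math.sqrt(t_crit ** 2 + df)   # threshold correlation
    return (r_obs - r_crit) / (1 - abs(r_crit))

# Hypothetical example: t = 4.0 on 100 degrees of freedom
threshold = itcv(4.0, 100)
```

A larger threshold means the inference is more robust: a confound would need correlations of that magnitude with both the predictor and the outcome to invalidate it.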


PDC22: Advanced Analysis Using School-Based International Large-Scale Assessment Databases: TIMSS, PIRLS, and PISA

Instructors: Eugenio Gonzalez, Educational Testing Service
Date: Sunday, April 15
Time: 8 a.m. – 12 p.m. 
Location: Sheraton New York Times Square, Lower Level - Sugar Hill
Fee: $105

This course covers how to download and prepare the public-use data files from TIMSS, PIRLS, and PISA and how to conduct basic and advanced statistical analysis. It will cover the statistical complexities and techniques used in these studies and their implications for analysis, through a combination of lectures and hands-on exercises. Participants will learn to prepare the data files for advanced statistical analysis and to conduct basic and some advanced analyses using customized software provided during the course. Participants need to bring a laptop computer with Windows and SPSS or SAS installed, and must have knowledge of basic and intermediate statistics. Data files and demonstration software will be distributed during the course.
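Part of the statistical complexity of these databases is their use of replicate weights for variance estimation: each statistic is recomputed under a set of replicate weights, and the spread of those replicates yields the standard error. A stripped-down Python sketch of the idea follows; the replicate counts and the scaling factor differ by study (jackknife repeated replication in TIMSS and PIRLS versus balanced repeated replication in PISA), so the example values here are assumptions, not the official specifications.

```python
import math

def replicate_se(full_estimate, replicate_estimates, factor=1.0):
    """Standard error from replicate weights.

    full_estimate: statistic computed with the full sampling weight
    replicate_estimates: the same statistic recomputed under each replicate weight
    factor: study-specific scaling constant (1.0 for a simple jackknife;
            other designs use different constants)
    """
    squared = sum((r - full_estimate) ** 2 for r in replicate_estimates)
    return math.sqrt(factor * squared)

# Hypothetical example: a full-sample mean of 512.0 and four replicate means
se = replicate_se(512.0, [511.0, 513.5, 512.5, 510.5])
```

The customized software provided during the course applies the correct replication scheme for each database; this sketch only shows the shape of the computation.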


PDC23: Evaluating for Improvement: Assessing the Implementation and Effectiveness of Strategies for Change Within a Continuous Improvement Cycle

Instructors: Sara E. Woodruff, Mathematica Policy Research, Inc.; Alexandra Resch, Mathematica Policy Research, Inc.
Date: Sunday, April 15
Time: 8 a.m. – 12 p.m. 
Location: Sheraton New York Times Square, Lower Level - Union Square
Fee: $105

This course will introduce participants to “evaluation for improvement,” a methodology for helping state and district education agencies integrate data and evidence into their continuous improvement efforts. The course will walk through the critical steps to design and carry out two types of formative assessments: “road tests” to improve implementation of a change strategy and “rapid cycle evaluation” to assess the effectiveness of a change strategy. The course will consist of short lectures interspersed with interactive, small-group activities that engage participants in the process of identifying opportunities, planning, and conducting evaluation for improvement. Participation will include a hands-on demonstration and practice with a free online toolkit for planning and conducting rapid cycle evaluations. Participants will learn to recognize opportunities to embed data and evidence into the normal course of improvement efforts, understand the steps to plan and conduct evaluation for improvement, and get hands-on experience planning a hypothetical evaluation for improvement through small-group activities. This course is appropriate for researchers and evaluators with a basic understanding of research methodology and program evaluation in education, and who wish to develop skills for working with states, districts, or schools on continuous improvement efforts.


PDC24: Strategies for Sharing Your Research

Instructors: Jenny Grant Rankin, University of Cambridge
Date: Sunday, April 15
Time: 8 a.m. – 12 p.m. 
Location: Sheraton New York Times Square, Lower Level - Chelsea
Fee: $65

This course will help participants present on a variety of stages—be they live, recorded, or written—and rock those stages. This course is based on content the presenter teaches at the University of Cambridge on how to best communicate research findings and was inspired by her experiences sharing findings with large, diverse audiences. The course is appropriate for scholars who have researched (or are currently researching) any topic within the education field and who do not want their findings to languish on a forgotten shelf or webpage. Participants will learn about a variety of presentation opportunities, how to land those opportunities, and strategies to maximize those opportunities to share their work with varied audiences. The course is split into three sections: laying the groundwork (branding, websites, social media, etc.), speaking (TED Talks, conferences, media interviews, NPR/radio, etc.), and writing (book deals, journals, magazines, etc.). Sections involve attendee participation, interaction, and hands-on activities to apply concepts to participants’ circumstances. Attendees will learn to leverage resources available to women and underrepresented minorities so more diverse perspectives are represented in field dialogue. In addressing significant professional development issues (e.g., writing and speaking strategies), this course will encourage more dynamic, memorable research presentations, and accessible, widespread communication of education research findings.


PDC25: Systematic Review: Tool for a Scholarly Critique of Literature

Instructors: Barbara Eleanor Polnick, Sam Houston State University; Beverly J. Irby, Texas A&M University - College Station; Margaret Foster; Julia Nell Ballenger, Texas A&M University - Commerce; Nahed Abdelrahman, Texas A&M University
Date: Sunday, April 15
Time: 8 a.m. – 12 p.m. 
Location: Sheraton New York Times Square, Lower Level - Murray Hill
Fee: $105

This course will help graduate students and early career faculty understand and gain the skills to analyze literature through a systematic review. Participants will conduct and evaluate systematic review studies within six modules: (a) formulate a research question suited to systematic review; (b) locate and appraise the data; (c) collect data using keywords in the databases; (d) import all the data to RefWorks; (e) create a Qualtrics database to analyze the data and develop codes and themes for their topics; and (f) interpret the results. Participants will work with simulated topics to develop an understanding of how to conduct a systematic review; they will also use their own topics to begin a systematic review that can lead to a publication. Participants should have access to RefWorks and Qualtrics, or at least be able to create accounts on both, prior to attending the course.


 

PDC27: Designing Adequately Powered Cluster Randomized Trials to Detect Main Effects, Moderation, and Mediation

Instructors: Jessaca K. Spybrook, Western Michigan University; Benjamin Kelcey, University of Cincinnati; Nianbo Dong, University of Missouri - Columbia; Kyle T. Cox, University of Cincinnati
Date: Monday, April 16
Time: 8 a.m. – 12 p.m. 
Location: Sheraton New York Times Square, Lower Level - Sugar Hill
Fee: $105

The purpose of this course is to train researchers and evaluators in how to plan efficient and effective cluster-randomized studies that probe hypotheses concerning main effects, mediation, and moderation. Participants use the free PowerUp! software programs designed to estimate the statistical power to detect mediation, moderation, and main effects across a wide range of designs. The course will combine lecture with hands-on practice with the free software programs. The target audience includes graduate students, researchers, and evaluators interested in planning and conducting multilevel studies that investigate mediation, moderation, or main effects.
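To give a flavor of the kind of calculation such power-analysis software performs, the minimum detectable effect size (MDES) for a two-level design with treatment assigned at the cluster level can be sketched as below. This Python illustration is an assumption-laden simplification, not the PowerUp! implementation: the multiplier 2.8 approximates a two-tailed test at alpha = .05 with 80% power and ample degrees of freedom.

```python
import math

def mdes_cra2(J, n, rho, P=0.5, R2_level1=0.0, R2_level2=0.0, M=2.8):
    """Approximate MDES for a two-level cluster randomized trial.

    J: number of clusters; n: individuals per cluster
    rho: intraclass correlation; P: proportion of clusters treated
    R2_level1 / R2_level2: variance explained by covariates at each level
    M: multiplier approximating t(alpha/2) + t(power)
    """
    cluster_term = rho * (1 - R2_level2) / (P * (1 - P) * J)
    individual_term = (1 - rho) * (1 - R2_level1) / (P * (1 - P) * J * n)
    return M * math.sqrt(cluster_term + individual_term)

# Hypothetical example: 40 clusters of 25 students, ICC = 0.20, balanced assignment
effect = mdes_cra2(J=40, n=25, rho=0.20)
```

The formula makes the familiar design lesson visible: with a nonzero intraclass correlation, adding clusters (J) shrinks the MDES far faster than adding individuals per cluster (n).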


PDC28: Mixed Methods Data from the Stanford Civic Purpose Project

Instructors: Heather Malin, Stanford University; William Damon, Stanford University; Susan Jekielek, University of Michigan; David Bleckley, University of Michigan; Emily Morton, Stanford University; Krystal Ann Smith, Stanford University
Date: Monday, April 16
Time: 8 a.m. – 12 p.m. 
Location: Sheraton New York Times Square, Lower Level - Union Square
Fee: $105

This course explores mixed methods research and secondary data analysis, using the Stanford Civic Purpose Project to drive course content. The Project produced quantitative and qualitative data on the civic experiences, attitudes, and motivations of young people, and these data are shared through the CivicLEADS archive at ICPSR at the University of Michigan. Lectures will focus on the study’s conceptualization, mixed methods, and data collection. Through a hands-on demonstration, participants will learn how to access the Project’s data from the CivicLEADS archive and use CivicLEADS’s online resources to discover, explore, and analyze data. Instructors will lead participants in group discussions regarding the potential for secondary analysis. Through these group discussions, participants will network with peers, formulate potential research questions, and lay the groundwork for future research and collaboration. Prerequisite skills or knowledge needed for course participation include interest in mixed methods or qualitative research in the areas of civic education, civic purpose, and/or civic development, as well as an understanding of secondary data analysis.


PDC29: Introduction to Infographics and Data Visualization

Instructors: Dino Sossi, Teachers College, Columbia University
Date: Monday, April 16
Time: 1 p.m. – 5 p.m. 
Location: Sheraton New York Times Square, Lower Level - Murray Hill
Fee: $65

This course introduces education researchers to design principles and techniques for effective data visualization, with a focus on qualitative data. Visualizations graphically depict data to foster communication, improve comprehension, and enhance decision making. This course will help educators and researchers at all levels learn to create their own visualizations to communicate complex educational issues, and will give them the opportunity to begin learning how to teach data visualization techniques in their own classes.


PDC30: Introduction to Quantile Regression for Education Research

Instructors: Jessica Logan, The Ohio State University - Columbus
Date: Monday, April 16
Time: 1 p.m. – 5 p.m. 
Location: Sheraton New York Times Square, Lower Level - Sugar Hill
Fee: $105

This course provides a conceptual introduction to quantile regression, guidance in crafting research questions suitable for quantile regression, hands-on practice using provided data and participants’ own data, and an overview of best practices in model building and presentation of results for publication. Quantile regression can analyze and accurately describe heteroscedastic relationships, which are common when examining the development of new skills (floor effects) or skill mastery (ceiling effects). It allows more complex research questions to be asked of the data, such as whether the strength of a relation is significantly greater at the high or low end of the outcome. The course is appropriate for researchers at any career stage. Participants should have a working knowledge of R or SAS, have experience analyzing and interpreting data using regression or regression-based models, and want to learn how to fit quantile regression within their research area. Participants should bring a laptop with SAS or R and RStudio installed, along with their own de-identified data set in .csv format.
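The conceptual core of quantile regression is the asymmetric "pinball" loss: minimizing it at quantile tau recovers the tau-th quantile rather than the mean. The toy Python illustration below shows this for a constant predictor; it is a sketch of the idea only, since in practice participants would fit models with R's quantreg package or SAS procedures, as the course materials assume.

```python
def pinball_loss(c, y, tau):
    """Total quantile (pinball) loss for a constant prediction c.

    Overpredictions are weighted (1 - tau) and underpredictions tau,
    so the minimizer sits at the tau-th quantile of y.
    """
    return sum(tau * (yi - c) if yi >= c else (1 - tau) * (c - yi) for yi in y)

y = list(range(1, 11))  # toy outcome: 1..10

# The constant minimizing the loss at tau lands near the tau-th quantile:
best_median = min(y, key=lambda c: pinball_loss(c, y, 0.5))  # near the median
best_upper = min(y, key=lambda c: pinball_loss(c, y, 0.9))   # near the 90th percentile
```

Replacing the constant with a linear predictor and minimizing this same loss gives quantile regression proper, which is why the slope can differ across the low and high ends of the outcome.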


PDC31: Student Privacy 101 for Researchers

Instructors: Amelia Vance, Future of Privacy Forum; Michael Hawes, U.S. Department of Education; Elana Zeide, Seton Hill University; Rachel Anderson, Data Quality Campaign
Date: Monday, April 16
Time: 1 p.m. – 5 p.m. 
Location: Sheraton New York Times Square, Lower Level - Union Square
Fee: $65

Since 2013, 39 states have passed 123 student privacy laws, many of which have direct or indirect implications for researchers. In 2017 alone, 45 student privacy bills directly mentioning education research were introduced, and Congress has introduced legislation that could significantly curtail education research. In this climate, it is essential that researchers understand student privacy and how they can both protect student data—beyond IRBs—and communicate about their privacy practices. This course will provide an overview of federal student privacy laws and state student privacy law trends that affect researchers; a comparative look at how researchers in other areas deal with privacy concerns; and best practices researchers can follow to alleviate student privacy concerns. This information will be conveyed through a mixture of panels, audience participation, and interactive case studies. Participants will come away knowing the most important student privacy issues and legislative trends for researchers, and how to improve their own privacy practices in order to protect student privacy and protect themselves from student privacy perception problems.
 
 
Virtual Research Learning Center

The AERA Virtual Research Learning Center (VRLC) is a virtual space for students, early career and advanced scholars, practitioners, and others in the education research community to receive professional development and research capacity building trainings.

 