2017 Annual Meeting Professional Development Courses

Professional Development and Training
Courses for 2017 Annual Meeting
Courses Start April 27, 2017


The AERA Professional Development and Training Committee is pleased to offer a robust program of courses for the 2017 AERA Annual Meeting in San Antonio, TX. The program was crafted based on consideration of more than 50 submissions and a competitive peer review process. Courses are designed at various levels (e.g., basic, intermediate, advanced) to reach graduate students, early career scholars, and other researchers who seek to increase their knowledge and enhance research skills.



Course Dates: 

  • The extended courses are held on Thursday, April 27, or on Friday, April 28.
  • The mini-courses will occur Friday through Sunday, April 28–30.

Registration: 

  • Potential participants can register for Professional Development and Training Courses by logging into and updating their 2017 AERA Annual Meeting registration.
  • You must be registered for the Annual Meeting in order to purchase tickets to attend courses.

Materials: 

  • Course participants should bring a laptop with any software suggested or specified in the course description.

Questions: 

  • Direct questions about the professional development and training courses to profdevel@aera.net.

Click a course number to learn more.

Extended Courses
  • PDC01: Analyzing NAEP Assessment Data Using R

  • PDC02: Autoethnography: Bringing the Known into Action to Create More Equitable Opportunity

  • PDC03: Coding Qualitative Data: A Survey of Selected Methods

  • PDC04: Creative Strategies for Mixing Qualitative and Quantitative Approaches in Mixed Methods Research

  • PDC05: Designing Surveys for Education Research

  • PDC06: Hierarchical Linear Modeling with Large-Scale International Databases

  • PDC07: How to Get Published - Guidance from Emerging and Established Scholars

  • PDC08: Introduction to Systematic Review and Meta-Analysis

  • PDC09: Propensity Score Methods and Their Applications in Observational Studies

  • PDC10: Reimagining School Accountability under ESSA: Opportunities and Challenges for Evaluating School Quality and Student Success

  • PDC11: Researchers' Learning Exchange: Interrogating Research through Historical and Socio-cultural Lenses

  • PDC12: Advanced Meta-Analysis
Mini Courses
  • PDC13: Reimagining Evaluation Capacity Building: Using Empowerment Evaluation with Evaluation Worksheets from Google

  • PDC14: Strategies for Sharing Your Research

  • PDC15: Using R Software for Item Response Theory (IRT) Model Calibrations

  • PDC16: Using NAEP Data on the Web for Educational Policy Research

  • PDC17: Using the AERA/APA/NCME Standards for Educational and Psychological Testing to Improve the Quality of Education Research

  • PDC18: What Would It Take to Change Your Inference? Quantifying the Discourse about Causal Inferences in the Social Sciences

  • PDC19: Accessing and Exploring NCES Data

  • PDC20: Enhancing the Validity Argument of Assessments: Identifying, Understanding, and Mitigating Unintended Consequences of Test Use

  • PDC21: Qualitative Analysis Software Data Jam: MAXQDA Hands-On

  • PDC22: Research Toward Equity and Justice: The Gordon Paradigm of Inquiry and Practice

  • PDC23: Building Researchers' Capacity to Partner with Practitioners to Conduct Relevant and Useful Research

  • PDC24: The Stanford Education Data Archive: Using Big Data to Study Academic Performance

  • PDC25: Using Multilevel Modeling to Meta-Analyze Single-Case Experimental Design Studies' Results

  • PDC26: Writing an Application for an IES Grant

  • PDC27: Storing, Managing, Coding, and Sharing Video Data with Databrary

Extended Courses

PDC01: Analyzing NAEP Assessment Data Using R
Instructors: Ting Zhang, American Institutes for Research; Emmanuel Sikali, U.S. Department of Education; Paul Dean Bailey, American Institutes for Research; Lauren Harrell, National Center for Education Statistics
Date: Thursday, April 27, 2017
Time: 8:00am – 3:45pm
Fee: $125.00
Location: San Antonio Marriott Rivercenter, Third Floor, Grand Ballroom Salon E

The primary purpose of this course is to introduce participants to the functionality of an R package called “edsurvey,” which is tailored to the processing and analysis of large-scale assessment data, and to teach participants how to analyze National Assessment of Educational Progress (NAEP), or NAEP-like, data with appropriate procedures using this freely downloadable package. The course will begin by familiarizing participants with the psychometric and sampling design of NAEP, essential knowledge for understanding and analyzing NAEP data. Afterward, the course will focus on hands-on practice with the R package “edsurvey.” This course is designed for individuals in government, universities, the private sector, and nonprofit organizations who are interested in learning how to analyze large-scale assessment data, especially NAEP or NAEP-like data. Participants should have at least a basic knowledge of R (e.g., from entry-level training in R programming) as well as of statistical techniques including statistical inference and multiple regression. Working knowledge of Item Response Theory and sampling theory is preferred. Participants need to bring a laptop preloaded with the latest version of R to participate in the hands-on portion.
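
For orientation only, the sketch below shows the kind of workflow the course covers, assuming the CRAN release of the package (distributed as “EdSurvey”); the file path and the variable names (composite, dsex, b017451) are placeholders drawn from the package documentation, not course materials.

    # Minimal sketch (not course material): weighted NAEP analysis with EdSurvey.
    # The file path and variable names below are placeholders.
    # install.packages("EdSurvey")
    library(EdSurvey)

    sdf <- readNAEP("path/to/NAEP/data/file.dat")   # returns an edsurvey.data.frame

    # Cross-tabulate a background variable against the composite scale;
    # sampling weights and plausible values are handled automatically.
    edsurveyTable(composite ~ dsex, data = sdf)

    # Weighted linear regression of the composite score on selected variables
    summary(lm.sdf(composite ~ dsex + b017451, data = sdf))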


PDC02: Autoethnography: Bringing the Known into Action to Create More Equitable Opportunity
Instructors: Sherick A. Hughes, University of North Carolina - Chapel Hill; Julie L. Pennington, University of Nevada - Reno
Date: Thursday, April 27, 2017
Time: 8:00am – 3:45pm
Fee: $125.00
Location: San Antonio Marriott Rivercenter, Third Floor, Grand Ballroom Salon F

The purpose of this course is to provide graduate students, prospective and novice university professors, and other participants with information about how to use autoethnographic research methods to incorporate the known worlds and realities of researchers and educators into actions designed to help create more equitable opportunities for all. This course will target qualitative and mixed-methods researchers longing for a methodological alternative to neutral, anonymous, and objectivist orientations to educational policy and practice. Instead of ignoring assumptions or viewing them as limitations, autoethnography allows educational researchers to unravel oftentimes concealed perspectives in order to move other researchers and educators to action on behalf of educational equity issues and concerns. Information will be provided related to understanding the history of autoethnographic work as well as designing, completing, and representing autoethnographic research in ways that push for justice and equity. Participants will leave the course equipped to begin adding autoethnography as an alternative tool in their methodological toolkits. The course will specifically focus on establishing a community of critical colleagues and friends.


PDC03: Coding Qualitative Data: A Survey of Selected Methods
Instructor: Johnny Saldana, Arizona State University
Date: Thursday, April 27, 2017
Time: 8:00am – 3:45pm
Fee: $125.00
Location: San Antonio Marriott Rivercenter, Third Floor, Grand Ballroom Salon I

This course will review methods of coding qualitative data, collected from various publications in education and the social sciences. Participants will explore basic coding and recoding principles, apply them to the analysis of qualitative data, and discuss their transfer to relevant future projects. The primary goals of the course are to acquaint participants with sources and methods of coding, including factors that influence and affect coding selections; approaches to data analysis; and writing analytic memos. Manual (hard copy) coding will be emphasized with a discussion of available CAQDAS (software) for future use. The course will focus comprehensively on eight methods: 1. Descriptive Coding; 2. In Vivo Coding; 3. Process Coding; 4. Values Coding; 5. Emotion Coding; 6. Versus Coding; 7. Dramaturgical Coding; and 8. Causation Coding. Target audiences for the course include graduate students in the initial stages of their qualitative research projects, and professors instructing qualitative research methods courses. Participants should have an introductory knowledge of qualitative research/inquiry and its canon of data collection methods.


PDC04: Creative Strategies for Mixing Qualitative and Quantitative Approaches in Mixed Methods Research
Instructors: Elizabeth G. Creamer, Virginia Polytechnic Institute and State University; Cherie D. Edwards, Virginia Polytechnic Institute and State University
Date: Thursday, April 27, 2017
Time: 8:00am – 3:45pm
Fee: $125.00
Location: San Antonio Marriott Rivercenter, Third Floor, Grand Ballroom Salon H

This interactive, problem-centered course introduces creative ways to mix qualitative and quantitative approaches in a mixed methods research project. Led by a faculty member with over fifteen years of experience teaching introductory and advanced graduate-level research methods courses in mixed methods, this one-day course is targeted toward graduate students and early career researchers with little or no prior knowledge of mixed methods research practice. The course uses a problem-oriented activity related to critical incidents, together with visual methods, to generate creative ideas about synergistic ways to integrate qualitative and quantitative data during analysis. Objectives for the course include (a) reviewing the distinctions between multi-method, mixed method, and fully integrated mixed method research; (b) distinguishing major reasons for using a mixed method approach; (c) presenting dialectical pluralism as the paradigmatic grounding for mixed method research; (d) describing strategies for mixing at all phases of the research process, including at the design and sampling stage; and (e) demonstrating a way to capture mixing through the construction of meta-inferences. Participants completing the course will take away ideas about ways to design a mixed methods research study. Participants will find it useful to bring a computer. A pre-course assignment is to read: Creamer, E. G. (2016). A primer about mixed methods research in an educational context. International Journal of Learning, Teaching, and Education Research, 15(8), 1-13.


PDC05: Designing Surveys for Education Research
Instructors: Jill Walston, American Institutes for Research; Jeremy Redford, American Institutes for Research; Mark Masterton, American Institutes for Research
Date: Thursday, April 27, 2017
Time: 8:00am – 3:45pm
Fee: $125.00
Location: San Antonio Marriott Rivercenter, Third Floor, Conference Room 3&4

Surveys are a common tool for collecting data in education research. Surveys of students, parents, teachers, school administrators, district leaders, and others provide much of the data used to assess needs, investigate relationships, and evaluate programs and interventions. Survey data contribute to education theory development, support policy decisions, and help inform problems of practice in all areas of education research. The field of survey research offers evidence-based methods for designing and administering effective surveys. This course draws from well-known texts and literature from the field of survey methodology and offers an overview of survey development techniques, best practices, and resources that will be useful to education agency staff. The course is designed for State Education Agency and Local Education Agency staff but is also appropriate for other practitioners and researchers who want to advance their understanding of and ability to create high-quality surveys. The course covers the basics of planning for a survey project, exploring existing item sources, writing high-quality survey items, and pretesting items, as well as sampling considerations, data collection methods, and strategies for maximizing response rates. The course includes engaging presentations, small group activities to practice skills, a binder with useful resource materials, and time for discussion and interaction with the presenters.


PDC06: Hierarchical Linear Modeling with Large-Scale International Databases
Instructors: David C. Miller, American Institutes for Research; Francis Howard Lim Huang, University of Missouri; Sakiko Ikoma, American Institutes for Research; Sabine Meinck, IEA Data Processing and Research Center; Bitnara Jasmine Park, American Institutes for Research; Austin Lasseter, Summit Consulting LLC; Yuan Zhang, University of Pittsburgh
Date: Thursday, April 27, 2017
Time: 8:00am – 3:45pm
Fee: $125.00
Location: Henry B. Gonzalez Convention Center, Ballroom Level, Room 303 C

Data from large-scale international studies reflect the nested structure of education systems and are, therefore, very well suited for hierarchical linear modeling (HLM). However, because these data come from complex cluster samples, there are methodological aspects that a researcher needs to understand when doing HLM (e.g., the need for using sampling weights and multiple achievement values for parameter estimation). This course will teach participants how to do HLM with data from large-scale studies, such as PIRLS, TIMSS, and PISA. The content of the course will include a presentation on the design of large-scale international studies and databases and implications for HLM analysis. Participants will learn how to specify simple two- and three-level models using the HLM 7 software package, incorporating students, classes or schools, and countries as hierarchical levels. There will be hands-on demonstrations about how to prepare an international dataset (using SPSS) for analysis with HLM 7 software and how to perform various HLM analyses. Participants will have the chance to work on practice exercises with several instructors available to answer questions. Participants should have a solid understanding of OLS regression and a basic understanding of hierarchical/multi-level models. Prior experience using a statistical software package, such as Stata or SPSS, is helpful. Prior knowledge about large-scale international studies or prior experience using the respective databases or HLM software is not required. To participate in the hands-on demonstrations and example analyses, participants should bring their own laptops with HLM software (a free student version is available).
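
The course itself uses the HLM 7 package; purely to illustrate the two-level structure described above (for example, students nested in schools), a rough equivalent of a random-intercept model in R's lme4 might look like the sketch below. This is an assumption for illustration only: the data frame and variable names are hypothetical, and the sampling weights and plausible values that the course addresses are omitted.

    # Illustration only (not the course's HLM 7 software): a two-level
    # random-intercept model with students nested in schools, fit with lme4.
    # Data frame and variable names are hypothetical; survey weights and
    # plausible values are not handled here.
    library(lme4)

    m <- lmer(achievement ~ ses + (1 | school_id), data = student_data)
    summary(m)   # fixed effect of ses plus between-school variance component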


PDC07: How to Get Published - Guidance from Emerging and Established Scholars
Instructors: Patricia A. Alexander, University of Maryland - College Park; Lauren M. Singer, University of Maryland - College Park; Sophie Jablansky, University of Maryland - College Park; Yuting Sun, University of Maryland - College Park; DeLeon Lavron Gray, North Carolina State University; Matthew T. McCrudden, Victoria University of Wellington; Panayiota Kendeou, University of Minnesota; Laura M. Stapleton, University of Maryland; Diane L. Schallert, The University of Texas at Austin; Sofie Loyens, University College Roosevelt; Jeff A. Greene, University of North Carolina - Chapel Hill
Date: Thursday, April 27, 2017
Time: 8:00am – 3:45pm
Fee: $125.00
Location: Henry B. Gonzalez Convention Center, Ballroom Level, Room 305

This course will provide graduate students and early career faculty with critical information about how to publish. The course will begin with an overview of the nuts and bolts of academic publishing by Dr. Patricia Alexander. Following this, scholars will present detailed sessions that cover the entire publishing process, from conceptualizing studies to preparing well-crafted manuscripts targeted to relevant journals. More specifically, sessions led by top scholars will cover: institutional and career fit, how to be a productive writer, finding equilibrium in academia, contemporary publishing topics and how to select appropriate journals, quality quantitative research, quality qualitative research, and ethical issues in publishing. Question and answer sessions will follow each presentation to allow course participants to interact with the scholars about the topics presented. There will be a final presentation by Dr. Alexander and Dr. Schallert that addresses any lingering questions and concerns. Each participant will be provided with materials, including handouts and work samples, that elaborate on the important points shared during the session.


PDC08: Introduction to Systematic Review and Meta-Analysis
Instructors: Amy L. Dent, Harvard University; Terri D. Pigott, Loyola University Chicago; Joshua R. Polanin, Development Services Group; Joseph Taylor, Biological Sciences Curriculum Study (BSCS)
Date: Thursday, April 27, 2017
Time: 8:00am – 3:45pm
Fee: $125.00
Location: Grand Hyatt San Antonio, Fourth Floor, Texas Ballroom Salon D

This one-day course will introduce the basics of systematic review and meta-analysis. Topics covered include developing a research question, searching the literature, evaluating and coding studies, conducting a meta-analysis, and interpreting results for various stakeholders. Participants are encouraged to bring an idea for a systematic review to the course, with time reserved for discussion about it with course instructors. Course activities will include lecture, hands-on exercises, small group discussion, and individual consultation. The target audience includes both those new to systematic review and meta-analysis as well as those currently conducting either type of project. Knowledge of basic descriptive statistics is assumed. Participants are required to bring a laptop computer.
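
The course description does not name specific software beyond asking participants to bring a laptop; as one hedged illustration of the final step (conducting a meta-analysis), a basic random-effects model in R's metafor package looks like the sketch below. The package choice and the effect sizes are assumptions, not course materials.

    # Hedged sketch: a random-effects meta-analysis with metafor (an assumption;
    # the course may use other tools). yi = effect sizes, vi = sampling variances,
    # both hypothetical.
    library(metafor)

    dat <- data.frame(yi = c(0.20, 0.35, 0.10, 0.48),
                      vi = c(0.04, 0.03, 0.05, 0.02))

    res <- rma(yi, vi, data = dat, method = "REML")  # random-effects model
    summary(res)   # pooled effect, heterogeneity statistics
    forest(res)    # forest plot of study-level and pooled effects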


PDC09: Propensity Score Methods and Their Applications in Observational Studies
Instructors: Haiyan Bai, University of Central Florida; Wei Pan, Duke University; Christopher M. Swoboda, University of Cincinnati
Date: Thursday, April 27, 2017
Time: 8:00am – 3:45pm
Fee: $125.00
Location: Grand Hyatt San Antonio, Fourth Floor, Texas Ballroom Salon E

Observational studies pose threats to the validity of treatment effect estimation due to selection bias. Propensity score methods have been increasingly used as a means of reducing selection bias so as to approximate the characteristics of experimental designs. This course will introduce concepts and issues related to propensity score methods, including matching, stratification, and weighting, and will also discuss when and how to use propensity score methods in observational studies using real-world examples with large-scale, national survey data. Lectures on theory and hands-on activities with statistical software in R and Stata will benefit faculty members, graduate students, and applied researchers in improving the quality of observational studies. Instructions for downloading and installing related statistical software and example data will be provided to participants in advance through a course website. No prior knowledge of propensity score methods is required. However, an understanding of research design and basic statistics, such as t-tests and regression, is preferable. Participants are advised to bring their own laptop computers for hands-on activities.
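
The course names R and Stata but not particular packages; as a hedged sketch of one common approach in R, nearest-neighbor matching on an estimated propensity score with the MatchIt package might look like this (the package choice and all variable names are assumptions).

    # Hedged sketch: propensity score matching with MatchIt (an assumption).
    # Variable names are hypothetical.
    library(MatchIt)

    m.out <- matchit(treat ~ pretest + ses + minority,
                     data = study_data, method = "nearest", distance = "glm")
    summary(m.out)                 # covariate balance before and after matching
    matched <- match.data(m.out)   # matched sample with weights

    # Estimate the treatment effect in the matched sample
    summary(lm(outcome ~ treat, data = matched, weights = weights))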


PDC10: Reimagining School Accountability under ESSA: Opportunities and Challenges for Evaluating School Quality and Student Success
Instructors: Rolf K. Blank, NORC at the University of Chicago; Katie H. Buckley, Harvard University; Susan Lyons, National Center for the Improvement of Educational Assessment, Inc.
Date: Thursday, April 27, 2017
Time: 8:00am – 3:45pm
Fee: $125.00
Location: Grand Hyatt San Antonio, Fourth Floor, Texas Ballroom Salon F

This course provides researchers with professional development on the accountability and reporting requirements under the Every Student Succeeds Act (ESSA) and gives participants an opportunity to grapple with the key considerations and research supporting the selection of measures and indicators. The course will begin with a discussion of the importance of an educational vision and theory of action for indicators to meet accountability requirements. While broadly applicable to all of the indicators under ESSA, special attention will be given to the indicator of school quality or student success (the so-called “fifth” indicator). This indicator presents an opportunity to define and operationalize school quality and its reporting in an accountability system.


PDC11: Researchers’ Learning Exchange: Interrogating Research through Historical and Socio-cultural Lenses
Instructors: Lynda Tredway, Institute for Educational Leadership; Gretchen Givens Generett, Duquesne University; Miguel Angel Guajardo, Texas State University - San Marcos; Francisco Guajardo, The University of Texas at Rio Grande Valley; Matthew C. Militello, East Carolina University; Christopher A. Janson, University of North Florida
Date: Thursday, April 27 and Friday, April 28, 2017
Time: 8:00am – 3:45pm
Fee: $140.00
Location: Grand Hyatt San Antonio, Fourth Floor, Texas Ballroom Salon G

This two-day professional learning course focuses on developing a deeper understanding of how we interrogate our research practices as activist researchers. Specifically, the Researchers’ Learning Exchange offers experiential and interactive approaches foundational to creating and implementing activist research. By emphasizing the researcher’s role as an agent of change, the learning exchange process honors learning as a dynamic social process that engages participants in critical conversations to fuse the boundary between researchers and traditional subjects of research. By examining their identities as researchers, as people engaging others, and as agents of change, researchers are better able to understand their role within the complex ecology of self, organization, and community. In turn, the researcher constructs a complex understanding of the roles of co-participants in the research process. The research design fully integrates the responsibility of addressing issues of inequity and fostering a sense of possibility that honors and builds on the assets of researchers and co-researchers alike.


PDC12: Advanced Meta-Analysis
Instructors: Terri D. Pigott, Loyola University Chicago; Joshua R. Polanin, Development Services Group; Ryan Williams, American Institutes for Research; Ariel M. Aloe, University of Iowa
Date: Friday, April 28, 2017
Time: 8:00am – 3:45pm
Fee: $125.00
Location: San Antonio Marriott Riverwalk, Second Floor, Ballroom Level, Alamo Ballroom Salon D

This one-day course will introduce advanced methods in meta-analysis. Topics covered include the computation of effect sizes from complex research designs; the use of fixed versus random effects models; planning for the conduct of a meta-analysis; methods for exploring heterogeneity of effect sizes, including categorical models and meta-regression; and graphical methods for the display of meta-analytic results and for testing the sensitivity of results to publication bias. The statistical package R will be used for the techniques discussed. Participants are encouraged to bring their own research in progress to the course. The activities will include lecture, hands-on exercises, and individual consultation. This course is designed to follow the introduction to systematic review and meta-analysis course given by the instructors in prior AERA Professional Development training sessions. The target audience is researchers with systematic review and meta-analysis experience who need additional in-depth knowledge of how to conduct and interpret models of effect size. Knowledge of basic descriptive statistics is assumed. Participants are required to bring a laptop computer.
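
Since the course specifies R but not a particular package, the hedged sketch below uses metafor (an assumption) to show two of the listed topics, meta-regression on a moderator and a simple publication-bias check; the data are hypothetical.

    # Hedged sketch: meta-regression and publication-bias diagnostics with
    # metafor (an assumption). Effect sizes, variances, and the moderator
    # are hypothetical.
    library(metafor)

    dat <- data.frame(yi = c(0.10, 0.42, 0.25, 0.55, 0.05),
                      vi = c(0.03, 0.02, 0.04, 0.02, 0.05),
                      dosage = c(5, 20, 10, 25, 2))

    summary(rma(yi, vi, mods = ~ dosage, data = dat))  # mixed-effects meta-regression

    re <- rma(yi, vi, data = dat)   # random-effects model without moderators
    funnel(re)                      # funnel plot
    regtest(re)                     # Egger-type regression test for asymmetry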


Mini Courses

 

PDC13: Reimagining Evaluation Capacity Building: Using Empowerment Evaluation with Evaluation Worksheets from Google
Instructors: David M. Fetterman, Fetterman and Associates; Jason Ravitz, Google, Inc.
Date: Friday, April 28, 2017
Time: 8:00am – 12:00pm
Fee: $55.00
Location: San Antonio Marriott Riverwalk, Second Floor, Ballroom Level, Alamo Ballroom Salon A

Employing lecture, activities, demonstration, and discussion, this course introduces participants to empowerment evaluation and a more detailed evaluation planning process developed at Google. It presents how these approaches work together to enhance each other, and introduces technological tools to facilitate the process, utilizing methods from courses already taught at Google and Pacifica Graduate Institute. Empowerment evaluation builds capacity and fosters program improvement by teaching people how to evaluate their own programs. Key concepts include having a critical friend, cycles of reflection and action, and a community of learners. These can enhance efforts to produce measurable outcomes and demonstrate accountability. Basic steps include: 1) establishing a mission; 2) taking stock – rating performance; and 3) planning for the future – establishing goals and strategies to accomplish objectives. A dashboard is used to compare annual goals with quarterly progress. The role of the evaluator is one of a facilitator, while the team is in charge of the evaluation. Course participants learn to combine the strengths of empowerment evaluation with more detailed worksheet planning. There are no prerequisites. The course is designed for all levels, with optional pre-assignments, course exercises, and post-assignments.


PDC14: Strategies for Sharing Your Research
Instructor: Jenny Grant Rankin, University of Cambridge
Date: Friday, April 28, 2017
Time: 8:00am – 12:00pm
Fee: $55.00
Also available via live webcast and from the AERA-Virtual Research Learning Center
Location: San Antonio Marriott Riverwalk, Second Floor, Ballroom Level, Alamo Ballroom Salon C

This course focuses on how best to communicate research findings. It is appropriate for researchers and scholars who have researched (or are currently researching) any topic within the education field and who do not want their findings to languish on a forgotten shelf or webpage. Participants will learn about a variety of opportunities, how to land those opportunities, and strategies to maximize those opportunities to share their work with varied audiences. The course is split into three sections: laying the groundwork (branding, websites, social media, etc.), speaking (TED Talks, conferences, media interviews, NPR/radio, etc.), and writing (book deals, journals, magazines, etc.). Sections involve participant interaction and hands-on activities to apply concepts. Participants will learn to leverage resources available to women and underrepresented minorities so more diverse perspectives are represented in field dialogue. In addressing significant professional development issues (e.g., writing and speaking strategies), this course will encourage dynamic, memorable research presentations and accessible, widespread communication of education research findings.


PDC15: Using R Software for Item Response Theory (IRT) Model Calibrations
Instructors: Ki Matlock, Oklahoma State University; Insu Paek, Florida State University; Taeyoung Kim, University at Buffalo - SUNY
Date: Friday, April 28, 2017
Time: 8:00am – 12:00pm
Fee: $95.00
Location: San Antonio Marriott Riverwalk, Second Floor, Ballroom Level, Alamo Ballroom Salon E

This interactive training course will introduce the concepts of unidimensional and multidimensional IRT models and provide instruction, demonstration, and hands-on opportunities to use the free R software to estimate commonly used IRT models. Concepts of commonly used unidimensional and multidimensional IRT models will be taught (e.g., Rasch, 1PL, 2PL, 3PL, GR, and GPC), with little focus on statistical theory. Participants will receive detailed training on how to execute the eRm, ltm, and mirt IRT packages and interpret the results, with ample opportunities for hands-on analysis. Example datasets will be provided for practical applications. Prior knowledge of R and/or IRT is not required, but a basic understanding of elementary and intermediate statistics is recommended. Familiarity with writing syntax may also be helpful for using R but is not essential. The target audience for this course includes graduate students, practitioners, and researchers interested in advancing their knowledge of IRT and enhancing their skills in using R to do IRT analysis. Participants should bring their own laptops with the free R software installed. Instructions for downloading R and installing the necessary packages will be provided prior to the course.
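
As a small, hedged preview of the kind of calibration covered, the sketch below fits a unidimensional 2PL model with mirt (one of the three packages named above) using the example LSAT7 data bundled with that package.

    # Hedged sketch: 2PL calibration with mirt on its bundled LSAT7 example data.
    library(mirt)

    dat <- expand.table(LSAT7)                       # 5 dichotomous items
    mod <- mirt(dat, model = 1, itemtype = "2PL")    # unidimensional 2PL
    coef(mod, IRTpars = TRUE, simplify = TRUE)       # discrimination/difficulty
    plot(mod, type = "trace")                        # item characteristic curves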


PDC16: Using NAEP Data on the Web for Educational Policy Research
Instructors: Debra Kline, Educational Testing Service; Edward M. Kulick, Educational Testing Service; Emmanuel Sikali, U.S. Department of Education
Date: Friday, April 28, 2017
Time: 1:00pm – 5:00pm
Fee: $95.00
Location: San Antonio Marriott Riverwalk, Second Floor, Ballroom Level, Alamo Ballroom Salon A

This course is for researchers interested in exploring NAEP data through the NAEP Data Explorer web tool. Participants will be guided through an examination of the data, focusing on 1) student, teacher, and school variables; 2) relationships between student performance and characteristics of students, their teachers, and their schools; and 3) using NAEP data to supplement other education research. Participants will have the opportunity to work independently and share their findings with the group. The course will highlight 2015 NAEP math and reading data. The course provides hands-on learning and active participation. A laptop or tablet computer with a wireless Internet card is required.


PDC17: Using the AERA/APA/NCME Standards for Educational and Psychological Testing to Improve the Quality of Education Research
Instructors: Linda L. Cook, Educational Testing Service; Wayne J. Camara, ACT, Inc.; Michael T. Kane, Educational Testing Service
Date: Friday, April 28, 2017
Time: 1:00pm – 5:00pm
Fee: $55.00
Also available via live webcast and from the AERA-Virtual Research Learning Center
Location: San Antonio Marriott Riverwalk, Second Floor, Ballroom Level, Alamo Ballroom Salon C


The purpose of this course is to help education researchers improve the quality of their research through application of the recently published Standards for Educational and Psychological Testing (AERA/APA/NCME, 2014). The course content will focus on the 2014 Standards with a specific emphasis on the reliability, validity, and fairness standards. The content of the course will also include selected materials that will be reviewed by course participants and used to demonstrate how the Standards can be applied and used to improve the fairness, validity, and reliability of the materials. At the end of the course, participants should be familiar with the content of the 2014 Standards and should understand and be able to apply the standards included in the Fairness, Reliability, and Validity chapters to their work. The target audience for the course is advanced graduate students, early career scholars, and senior researchers and practitioners who are interested in strengthening their understanding of how the Standards can be used to improve their research. Prerequisite skills required by the course are a general familiarity with the ideas and vocabulary associated with simple measurement concepts. The course will include both lecture and interaction. Participants will be asked to critique several examples of research studies or other materials using the reliability, validity, and fairness standards. Issues related to state testing programs, selection tests, international testing programs, and value-added models will be included in the discussion of the application of the Standards. Participants are encouraged to bring their own examples to the course for discussion; however, participants are not required to bring any material or equipment to the course.


PDC18: What Would It Take to Change Your Inference? Quantifying the Discourse about Causal Inferences in the Social Sciences
Instructor: Kenneth A. Frank, Michigan State University
Date: Friday, April 28, 2017
Time: 1:00pm – 5:00pm
Fee: $95.00
Location: San Antonio Marriott Riverwalk, Second Floor, Ballroom Level, Alamo Ballroom Salon B

In the first part of this course, participants will use Rubin’s causal model to interpret how much bias there must be to invalidate an inference (in terms of replacing observed cases with counterfactual cases or cases from an unsampled population). In the second part, participants will quantify the robustness of causal inferences in terms of correlations associated with unobserved variables or in unsampled populations. Calculations for bivariate and multivariate analysis will be presented in the KonFound-it! spreadsheet for calculating indices, with links to SPSS, SAS, and Stata. The course format will be a mixture of presentation, individual exploration, and group work. Participants may include graduate students and professors, although all must be comfortable with basic regression and multiple regression. Participants should bring their own laptops or be willing to work with another student who has a laptop. Participants may choose to bring to the course an example of an inference from a published study or their own work, as well as data analyses they are currently conducting.
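
The course works with the KonFound-it! spreadsheet; as a hedged base-R sketch of the core quantity in this framework, the proportion of an estimate that would have to be due to bias to invalidate the inference can be computed from the estimate, its standard error, and the critical t value. All numbers below are hypothetical.

    # Hedged base-R sketch of "% bias to invalidate an inference" (the course
    # itself uses the KonFound-it! spreadsheet). All numbers are hypothetical.
    est <- 0.50    # estimated regression coefficient
    se  <- 0.20    # its standard error
    n   <- 200     # sample size
    k   <- 5       # number of covariates

    df        <- n - k - 1
    threshold <- qt(0.975, df) * se    # smallest estimate still significant at alpha = .05
    1 - threshold / est                # share of the estimate bias must account for (~0.21 here)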


PDC19: Accessing and Exploring NCES Data
Instructors: Stephen Quin Cornman, U.S. Department of Education; Emmanuel Sikali, U.S. Department of Education; Andrew A. White, National Center for Education Statistics; Gigi Jones, U.S. Department of Education; Douglas E. Geverdt, U.S. Census Bureau; Sarah Kathryn Grady, National Center for Education Statistics; Stephanie R. Miller, National Center for Education Statistics
Date: Saturday, April 29, 2017
Time: 8:00am – 12:00pm
Fee: $95.00
Also available via live webcast and from the AERA-Virtual Research Learning Center
Location: San Antonio Marriott Riverwalk, Second Floor, Ballroom Level, Alamo Ballroom Salon C


The National Center for Education Statistics (NCES) has several state-of-the-art data tools that allow users to easily access and analyze data. This course provides participants with a comprehensive overview of those tools for accessing data sets. Participants will learn how to access public-use and restricted-use data sets, create reports and data tables, find published reports, and conduct analyses in selected statistical tools. NCES offers a large variety of national, state, local, school, and student data sets, including assessment data, cross-sectional data, survey data, and administrative records. Participants will learn which data sets cover their educational topic of interest. The course is designed for graduate students, faculty members, researchers, and other users with an interest in using NCES data for their research studies, evaluations, and data projects. No prerequisite skills are required to attend; participants should bring their personal laptops for interactive, in-class activities. Each participant will receive an Accessing and Exploring NCES Data “cookbook” that provides comprehensive instruction and screenshots explaining how to navigate six NCES data tools and one distance learning module.


PDC20: Enhancing the Validity Argument of Assessments: Identifying, Understanding, and Mitigating Unintended Consequences of Test Use
Instructors: Maria Elena Oliveri, Educational Testing Service; Cathy LW Wendler, Educational Testing Service
Date: Saturday, April 29, 2017
Time: 8:00am – 12:00pm
Fee: $95.00
Location: San Antonio Marriott Riverwalk, Second Floor, Ballroom Level, Alamo Ballroom Salon A

This course focuses on methods for identifying and mitigating unintended consequences of testing. Unintended consequences may impact the interpretation of test results and may negatively influence opportunities for individuals from diverse groups. To address these concerns, participants will be presented with a framework for identifying unintended consequences, will become familiar with tools and methods to mitigate unintended consequences, and will be presented with various models that are applicable to diverse assessment programs administered nationally and internationally. They will apply various tools, such as a Theory of Action and a validity rationale, to their own or another assessment program. A mixture of lecture, hands-on exercises, and group work will be used. The target audience includes practitioners in the measurement field at the local, state, national, and international levels; individuals responsible for managing testing programs; assessment developers; and early career scholars and advanced researchers interested in expanding theoretical and hands-on knowledge. The target audience is expected to be familiar with basic measurement concepts (e.g., validity and reliability). Participants should review the Standards for Educational and Psychological Testing prior to the course.


PDC21: Qualitative Analysis Software Data Jam: MAXQDA Hands-On
Instructors: Christian Schmieder, University of Wisconsin – Extension; Ellen Bechtol, University of Wisconsin – Extension
Date: Saturday, April 29, 2017
Time: 8:00am – 12:00pm
Fee: $55.00
Location: San Antonio Marriott Riverwalk, Second Floor, Ballroom Level, Alamo Ballroom Salon B

This course will introduce an integrated software and methods curriculum designed to foster interdisciplinary research, establish inquiry-driven affinity spaces, and build institutional frameworks for distributed analysis of large qualitative datasets (50–5,000 records). Data Jams are specifically designed to create and maintain institutionally shared methodological and qualitative data analysis software skills, thus eliminating an institution’s reliance on external software training and consulting. In this hands-on session, participants will learn how to institutionally implement, organize, and lead Data Jams. Participants will flesh out options for the integration of qualitative data analysis software into their own research flows, methods curricula, and the broader institutional contexts of their work. We will outline what needs to be in place to plan and continuously offer these collaborative interdisciplinary analysis sessions. Participants must bring laptops with MAXQDA (demo) installed.


PDC22: Research Toward Equity and Justice: The Gordon Paradigm of Inquiry and Practice
Instructors: Ezekiel J. Dixon-Román, The University of Pennsylvania; Louis M. Gomez, University of California – Los Angeles; Stafford Hood, University of Illinois at Urbana-Champaign
Date: Saturday, April 29, 2017
Time: 8:00am – 12:00pm
Fee: $55.00
Location: San Antonio Marriott Riverwalk, Second Floor, Ballroom Level, Alamo Ballroom Salon D

This professional development course introduces both the career work and contributions of Edmund W. Gordon and what we have come to call the Gordon Paradigm of Inquiry and Practice. The Gordon Paradigm of Inquiry and Practice represents a shift in research on the education and human development of marginalized groups and the socially constituted “other” or “different.” This has been a paradigmatic revolution away from scholarship that has understood difference to be based on biological determinism, social deficiency, or a culture of poverty, toward a rather Marxist educational and cultural psychological lens on the human learning and development of all persons. The Gordon Paradigm has not adhered to disciplinary boundaries but instead has been purposefully transdisciplinary, theoretically promiscuous, and methodologically agnostic. In other words, the Gordon Paradigm has been marked by a particular type of intellectual hunger, habits of mind, and scholarly orientation that takes a perspectivist posture toward knowledge production for social understanding, with a paramount concern for bettering the lives of the marginalized.


PDC23: Building Researchers’ Capacity to Partner with Practitioners to Conduct Relevant and Useful Research
Instructors: Julie R. Kochanek, American Institutes for Research; Shazia R. Miller, American Institutes for Research
Date: Sunday, April 30, 2017
Time: 8:00am – 12:00pm
Fee: $55.00
Location: San Antonio Marriott Riverwalk, Second Floor, Ballroom Level, Alamo Ballroom Salon A

The learning objectives for early career scholars and advanced researchers participating in this course are to develop a deeper understanding of the features, developmental stages, and indicators of success of various types of research-practice partnerships; gain insight into the roles individuals play in a partnership and the process for developing one; construct a theory of action for a research-practice partnership; describe the steps needed to negotiate a research agenda; and explain the processes and strategies used to establish and maintain a research-practice partnership's momentum. The instructors will use a combination of brief presentations and hands-on exercises to build participants' capacity to partner with practitioners to conduct relevant research. Hard copies of all materials needed for the course will be provided by the instructors, with the exception of the pre-course required readings.


PDC24: The Stanford Education Data Archive: Using Big Data to Study Academic Performance
Instructors: Sean F. Reardon, Stanford University; Andrew Dean Ho, Harvard University; Benjamin R. Shear, University of Colorado - Boulder; Erin Michelle Fahle, Stanford University
Date: Sunday, April 30, 2017
Time: 8:00am – 12:00pm
Fee: $95.00
Location: San Antonio Marriott Riverwalk, Second Floor, Ballroom Level, Alamo Ballroom Salon C

The Stanford Education Data Archive (SEDA) is a new publicly-available dataset based on roughly 300 million standardized test scores taken by students in U.S. public schools from 2009 to 2015. SEDA currently contains average test scores by grade (grades 3-8), year (2009-2015), test subject (math and ELA), and subgroup (gender, race/ethnicity, and free lunch eligibility) for all school districts in the U.S. The test scores from different states, grades, and years are aligned to a common national scale, allowing comparisons of student performance across place and time. This course will provide a detailed description of what SEDA contains and how the dataset was constructed. It will include a description of how the test scores are linked to a common scale, a discussion of the sources and magnitude of uncertainty in the estimates, and a guide for interpretation of the estimates and how they can be appropriately used in both descriptive and causal research.


PDC25: Using Multilevel Modeling to Meta-Analyze Single-Case Experimental Design Studies’ Results
Instructors: Wim Van den Noortgate, Katholieke Universiteit Leuven; Susan Natasha Beretvas, The University of Texas at Austin; John M. Ferron, University of South Florida; Mariola Moeyaert, University at Albany - SUNY
Date: Sunday, April 30, 2017
Time: 8:00am – 12:00pm
Fee: $95.00
Location: San Antonio Marriott Riverwalk, Second Floor, Ballroom Level, Alamo Ballroom Salon B

This course introduces graduate students and researchers to the use of multilevel models for meta-analyzing single-subject experimental design (SSED) studies. It demonstrates how to complement visual analysis inferences with multilevel model analysis results. The course will begin with an overview of multilevel modeling and how the model can be used for SSED meta-analysis of unstandardized data. Next, an overview of how to standardize raw data or effect sizes will be given, and the meta-analysis of these standardized data will be demonstrated. Multiple real-world examples will be provided, as well as opportunities for guided practice through a series of hands-on activities. Copies of commented SAS, R, and HLM code and output for every analysis will be provided, so participants do not have to bring computers. Participants need only be familiar with multiple regression analysis and the principles of statistical inference.
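
Course materials include commented SAS, R, and HLM code; as one hedged R illustration of the two-level structure (measurements nested within cases) for unstandardized data, a model with a treatment-phase effect could be specified with lme4 as below. The variable names are hypothetical, and the course's own code should be treated as authoritative.

    # Hedged illustration: two-level model for single-case data with a phase
    # effect, using lme4 (variable names hypothetical; a third, between-study
    # level is omitted).
    library(lme4)

    # phase: 0 = baseline, 1 = treatment; case_id identifies each participant
    m <- lmer(outcome ~ phase + (1 + phase | case_id), data = sced_data)
    summary(m)   # fixed effect of phase = average treatment effect across cases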


PDC26: Writing an Application for an IES Grant
Instructors: Allen Ruby, Institute of Education Sciences; Meredith J. Larson, National Center for Education Research
Date: Sunday, April 30, 2017
Time: 8:00am – 12:00pm
Fee: No fee. This course is by application only; the application deadline was extended to March 5, 2017.
Location: San Antonio Marriott Riverwalk, Second Floor, Ballroom Level, Alamo Ballroom Salon E

This course will provide instruction on writing a successful grant application to the Institute of Education Sciences (IES) Education Research Grants Program (84.305A) and Special Education Research Grants Program (84.324A). The course will focus on: 1) the research topics; 2) the research goal structure; and 3) the four sections of the Research Narrative (Significance, Research Plan, Personnel, and Resources) that comprise the most important part of the IES grant application. If there is also demand for information on the IES grant programs that support partnerships between researchers and practitioners (84.305H, 84.305L, and 84.324L), the course will address those programs as well, including the fifth section of their Research Narrative (Partnership). Course leaders will introduce specific concepts and examples of strategies for writing key sections of the Research Narrative (e.g., introduction to the proposal, intervention description, theory of change, research design, analysis, and personnel), as well as examples of common errors and how to avoid them. Participants will be asked to submit initial drafts of some of these key sections and will work in small groups to revise them based on course leader and participant feedback. This course is by application only.


PDC27: Storing, Managing, Coding, and Sharing Video Data with Databrary
Instructors: Rick O. Gilmore, The Pennsylvania State University; Kasey C. Soska, Databrary
Date: Sunday, April 30, 2017
Time: 1:00pm – 5:00pm
Fee: $95.00    
Location: San Antonio Marriott Riverwalk, Second Floor, Ballroom Level, Alamo Ballroom Salon E

This course is for scholars and students interested in access to and analysis of video data. Video captures much of the richness and complexity of behavior in formal and informal learning settings. The Databrary project empowers researchers by providing the software, cyberinfrastructure, and policy framework for securely storing, managing, coding, and sharing video data. Funded by NSF and NIH, Databrary is a rich data archive with current video data sets and an open-source desktop coding tool (see Datavyu.org). This course will teach researchers, from advanced to emerging scholars, how to exploit the richness inherent in recordings of behavior. Instructors will share best practices for coding behavioral data and show how to use the Datavyu coding tool to mine video data. Instructors will explain Databrary’s policy framework, which enables sharing of video data among researchers while addressing concerns about participants’ privacy. Course participants will learn how to use Databrary as a lab server and a secure backup for large quantities of data, and how to use Databrary as a resource for teaching and research. There are no prerequisite skills required to attend the workshop beyond an interest in or experience with using video data. Participants can bring laptops, but these are not required for attendance.