GU6008 Experimental Methods for Social Research
Spring 2024 | Thursdays, 10:10am-12:00pm
Room: 501D Knox Hall
Office hours: 1-2:30pm Tuesdays, please sign up here
Instructor Email: jyc2163@columbia.edu
Course Summary
What is this course about?
An experiment is a data collection strategy that involves randomization and control. When properly designed, experiments enable unbiased estimation of causal effects in a sample. The goal of this course is to introduce you to the logic and practice of experiments. This course has two parts. The first part is an introduction to experiments and their assumptions. You will build upon this knowledge in the second part of the course to design an experiment on a topic of your choice.
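To preview the core logic in miniature, the sketch below (hypothetical code, not a course requirement; all names and parameter values are invented for illustration) simulates a simple two-arm experiment and shows that, averaged over many random assignments, the difference in means recovers the true average treatment effect:

```python
# Illustrative sketch: why randomization yields unbiased estimates of the
# average treatment effect (ATE). All values here are hypothetical.
import random

random.seed(42)

N = 1000          # sample size
TRUE_ATE = 2.0    # true average treatment effect built into the simulation

# Fixed potential outcomes for each unit: Y(0), and Y(1) = Y(0) + effect
y0 = [random.gauss(10, 3) for _ in range(N)]
y1 = [y + TRUE_ATE for y in y0]

def diff_in_means_estimate():
    """Randomly assign half the units to treatment; return the difference in means."""
    idx = list(range(N))
    random.shuffle(idx)
    treated = set(idx[: N // 2])
    t_mean = sum(y1[i] for i in treated) / (N // 2)
    c_mean = sum(y0[i] for i in range(N) if i not in treated) / (N - N // 2)
    return t_mean - c_mean

# Averaging over many random assignments recovers the true ATE (unbiasedness)
estimates = [diff_in_means_estimate() for _ in range(2000)]
avg = sum(estimates) / len(estimates)
print(avg)  # should land close to TRUE_ATE
```

Any single assignment gives a noisy estimate; unbiasedness is a property of the randomization procedure across repeated assignments, which is the sense in which the course uses the term.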
Why learn experimental methods for social research?
Experimental design is a useful framework for thinking through social scientific problems. You will learn how to conduct your own experiment, identify its strengths and weaknesses, and analyze results thoughtfully. These skills train you to formalize what you are trying to identify (the estimand) and how you plan to identify it. More broadly, they will help you think clearly about social scientific problems of all kinds.
Experiments are often deployed across the social sciences. You will learn how to read, understand, and critique experiments. You will also learn the limitations of experimental research, to better motivate other methodological choices.
Experimental methods are a foundation for learning other causal inference methods. We will not be covering quasi-experimental designs or other methods of causal inference in this course, but a firm grasp of experimental methods is often necessary to understand them.
What will you be able to know and do by the end of the course?
- Identify the logic of causality assumed in experiments.
- Evaluate the strengths and weaknesses of experimental research.
- Diagnose and address threats to internal validity in experiments.
- Analyze the results of an experiment.
- Recognize the limitations of experiments and be able to thoughtfully critique them.
- Design an experiment that you could submit for funding and implement.
What kind of student does this course have in mind?
This course is limited to doctoral students in Sociology and related disciplines. I will assume that you have taken at least an introductory statistics course (through linear regression). For a refresher, you may be interested in the following books:
- Angrist, Joshua D., and Jörn-Steffen Pischke. 2009. Mostly Harmless Econometrics: An Empiricist's Companion. Princeton: Princeton University Press.
- Morgan, Stephen L., and Christopher Winship. 2014. Counterfactuals and Causal Inference: Methods and Principles for Social Research, Second Edition. Cambridge, UK: Cambridge University Press.
How will performance be assessed?
| Assignment | Description | Deadlines |
|---|---|---|
| Short writing assignments (30%) | There are seven short assignments that build toward the final experiment proposal. Details about the assignments are provided along with the readings below. Assignments are graded for completeness. | Due throughout the semester via upload to Canvas by 11:59pm the day before our class meets. |
| Analysis paper (10%) | Based on data from an experiment, I will ask you to answer a specific set of questions. More details will be provided in class. | Due via upload to Canvas the day before our April 4 class meeting. |
| Experiment proposal (40%) | The final assignment of the course is to develop a proposal for an experiment, ideally one that can be submitted for funding. If you are proposing a survey experiment, I expect final assignments to be modeled after a TESS proposal (http://tessexperiments.org/). For other types of experiments, you should submit a pre-analysis plan or pre-registration, e.g., via the Open Science Framework. If you prefer another format, let me know by email as soon as possible. | Due via upload to Canvas by 11:59pm on Sunday, May 5. |
| In-class participation (20%) | A key component of this seminar is in-class participation, meaning informed contributions to discussion. You should be ready to discuss your short assignments in class. In addition, you will be asked to prepare a final presentation of your experiment proposal. | Throughout the semester. |
Please submit all assignments in 12-point type, double-spaced, with one-inch margins. These are disciplinary standards, and double spacing allows me to give better feedback. All assignments should be uploaded to Canvas.
What are the basic norms in this course?
- If you need disability-related accommodations, let me know as soon as possible. You have the right to have your needs met. If you need accommodations, you should be registered with the Office of Disability Services (ODS) in 008 Milbank (212-854-2388, disability@columbia.edu).
- Life happens. If you submit your work after the deadlines listed above, you will still qualify for half of the original points.
- Avoid using cell phones in class, which can prevent you or others from learning. In cases of emergencies, please take your phone outside.
- I try to respond to emails within 24 hours. You are welcome to follow up if I have not responded by then.
What are expectations regarding academic integrity?
Graduate students are expected to exhibit the highest level of personal and academic honesty as they engage in scholarly discourse and research. In practical terms, you must be responsible for the full and accurate attribution of the ideas of others in all of your research papers and projects; you must be honest when taking your examinations; you must always submit your own work and not that of another student, scholar, or internet source. Graduate students are responsible for knowing and correctly utilizing bibliographical guidelines.
Where can I access course materials?
The readings for each week of this course are listed below. There is one required textbook for this course:
- Gerber, Alan S., and Donald P. Green. 2012. Field Experiments: Design, Analysis, and Interpretation. New York: W.W. Norton.
Other than the required textbook, all readings are available online via the links below. Where a linked version of a reading is unavailable, I have uploaded it to CourseWorks (Canvas).
Readings and Due Dates for Assignments
Part I: Understanding Experiments
Week 1: Why Experiment? (Jan 18)
- Gerber, Alan S., and Donald P. Green. 2012. Field Experiments: Design, Analysis, and Interpretation. New York: W.W. Norton.
  - Ch. 1, “Introduction”
- Jackson, Michelle, and D. R. Cox. 2013. “The Principles of Experimental Design and Their Application in Sociology.” Annual Review of Sociology 39:27–49.
- Pearl, Judea, and Dana Mackenzie. 2018. The Book of Why: The New Science of Cause and Effect. New York: Basic Books. Chapter 1.
Week 2: The Logic and Assumptions of Experiments (Jan 25)
WE WILL BE MEETING IN ROOM 509 THIS WEEK
Due: Submit at most two paragraphs summarizing an experiment that you want to develop in this course. At minimum, your summary should include a research question, why the question is important, and a rough sketch of how you plan to answer it.
- Druckman, James N., et al. 2011. “Experiments: An Introduction to Core Concepts.” Ch. 2 in Cambridge Handbook of Experimental Political Science. New York: Cambridge University Press.
- Gerber, Alan S., and Donald P. Green. 2012. Field Experiments: Design, Analysis, and Interpretation. New York: W.W. Norton.
  - Ch. 2, “Causal Inference and Experimentation”
Week 3: Types of Experiments Pt I (Feb 1)
- Lab Experiments
  - Willer, Robb. 2009. “Groups Reward Individual Sacrifice: The Status Solution to the Collective Action Problem.” American Sociological Review 74(1):23–43.
  - Ridgeway, Cecilia L., and Shelley J. Correll. 2006. “Consensus and the Creation of Status Beliefs.” Social Forces 85(1):431–453.
- Survey Experiments
  - Schachter, Ariela. 2016. “From ‘Different’ to ‘Similar’: An Experimental Approach to Understanding Assimilation.” American Sociological Review 81(5):981–1013.
  - Phelan, Jo C., Bruce G. Link, and Naumi M. Feldman. 2013. “The Genomic Revolution and Beliefs about Essential Racial Differences: A Backdoor to Eugenics?” American Sociological Review 78(2):167–191.
- Audit Experiments
  - Pager, Devah. 2003. “The Mark of a Criminal Record.” American Journal of Sociology 108(5):937–975. doi: 10.1086/374403.
  - Correll, Shelley J., Stephen Benard, and In Paik. 2007. “Getting a Job: Is There a Motherhood Penalty?” American Journal of Sociology 112(5):1297–1339.
Week 4: Types of Experiments Pt II (Feb 8)
Due: Summarizing experiments memo (see handout).
- Web Experiments
  - Salganik, Matthew J., Peter Sheridan Dodds, and Duncan J. Watts. 2006. “Experimental Study of Inequality and Unpredictability in an Artificial Cultural Market.” Science 311(5762):854–856.
  - Kramer, Adam D. I., Jamie E. Guillory, and Jeffrey T. Hancock. 2014. “Experimental Evidence of Massive-Scale Emotional Contagion through Social Networks.” Proceedings of the National Academy of Sciences 111(24):8788–8790.
- Field Experiments
  - Paluck, Elizabeth Levy, Hana Shepherd, and Peter M. Aronow. 2016. “Changing Climates of Conflict: A Social Network Experiment in 56 Schools.” Proceedings of the National Academy of Sciences 113(3):566–571.
  - Broockman, David, and Joshua Kalla. 2016. “Durably Reducing Transphobia: A Field Experiment on Door-to-Door Canvassing.” Science 352(6282):220–224.
- Lab in the Field
  - Gneezy, Uri, and Alex Imas. 2017. “Lab in the Field: Measuring Preferences in the Wild.” Pp. 439–464 in Handbook of Economic Field Experiments, edited by Abhijit V. Banerjee and Esther Duflo. Amsterdam: North-Holland.
  - Baldassarri, Delia. 2015. “Cooperative Networks: Altruism, Group Solidarity, Reciprocity, and Sanctioning in Ugandan Producer Organizations.” American Journal of Sociology 121(2):355–395.
Week 5: Hallmarks of Valid and Publishable Social Science Experiments (Feb 15)
Due: Write a title and abstract for a paper you imagine writing based on your proposed experiment. Assume that your findings align with your theoretical predictions. Remember to establish why the findings matter for your intended audience.
- Druckman, James N. 2021. Experimental Thinking: A Primer on Social Science Experiments. New York: Cambridge University Press.
  - Ch. 3, “Evaluating Experiments: Realism, Validity, and Samples”
  - Ch. 6, “Designing ‘Good’ Experiments”
- CW McDermott, Rose. 2011. “Internal and External Validity.” Ch. 3 in Cambridge Handbook of Experimental Political Science, edited by James N. Druckman et al. New York: Cambridge University Press.
Part II: Designing Experiments
Week 6: Treatment Assignment (Feb 22)
Due: Outline of your proposed experiment (see handout).
- Gerber, Alan S., and Donald P. Green. 2012. Field Experiments: Design, Analysis, and Interpretation. New York: W.W. Norton.
  - Ch. 3, “Sampling Distributions, Statistical Inference, and Hypothesis Testing”
- Familiarize yourself with the different types of experimental designs discussed in the following:
  - Charness, Gary, Uri Gneezy, and Michael A. Kuhn. 2012. “Experimental Methods: Between-Subject and Within-Subject Design.” Journal of Economic Behavior & Organization 81(1):1–8.
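As a concrete illustration of this week's topic, here is a short sketch (hypothetical code, not drawn from the readings; block labels and unit ids are invented) of two standard assignment procedures, complete random assignment and block random assignment:

```python
# Illustrative sketch: two common ways to assign treatment.
import random

random.seed(3)

def complete_random_assignment(n, n_treated):
    """Assign exactly n_treated of n units to treatment (1), the rest to control (0)."""
    assignment = [1] * n_treated + [0] * (n - n_treated)
    random.shuffle(assignment)
    return assignment

def block_random_assignment(blocks):
    """Run complete random assignment separately within each block.

    `blocks` maps a block label (e.g., a covariate stratum) to a list of unit ids.
    Returns {unit_id: 0 or 1}, with half of each block treated.
    """
    assignment = {}
    for label, units in blocks.items():
        z = complete_random_assignment(len(units), len(units) // 2)
        for unit, treated in zip(units, z):
            assignment[unit] = treated
    return assignment

# Usage: complete randomization fixes the number treated; blocking additionally
# balances the treatment across strata of a pre-treatment covariate.
z = complete_random_assignment(10, 5)
blocked = block_random_assignment({"stratum_a": [1, 2, 3, 4], "stratum_b": [5, 6, 7, 8]})
```

Blocking guarantees balance on the blocking variable by construction, which typically tightens the sampling distribution of the estimator relative to complete randomization alone.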
Week 7: The Causal Chain / Mediation and Moderation (Feb 29)
Due: Revised outline, now including a new section called “Mediators and Moderators.” In this section, describe at least one theorized mediator and moderator and how you plan to measure them. Include a proposed causal diagram that relates all independent and dependent variables, moderators, and mediators.
- Baron, Reuben M., and David A. Kenny. 1986. “The Moderator-Mediator Variable Distinction in Social Psychological Research: Conceptual, Strategic, and Statistical Considerations.” Journal of Personality and Social Psychology 51(6):1173–1182.
- Gerber, Alan S., and Donald P. Green. 2012. Field Experiments: Design, Analysis, and Interpretation. New York: W.W. Norton.
  - Ch. 10, “Mediation”
- Imai, Kosuke, Luke Keele, Dustin Tingley, and Teppei Yamamoto. 2011. “Unpacking the Black Box of Causality: Learning about Causal Mechanisms from Experimental and Observational Studies.” American Political Science Review 105(4):765–89. doi: 10.1017/S0003055411000414.
- Kane, John V., and Jason Barabas. 2019. “No Harm in Checking: Using Factual Manipulation Checks to Assess Attentiveness in Experiments.” American Journal of Political Science 63(1):234–49.
Week 8: False Positives and Negatives / Statistical Power (March 7)
- Gerber, Alan S., and Donald P. Green. 2012. Field Experiments: Design, Analysis, and Interpretation. New York: W.W. Norton.
  - Ch. 4, “Using Covariates”
- Camerer, Colin F., Anna Dreber, Felix Holzmeister, Teck-Hua Ho, Jürgen Huber, Magnus Johannesson, Michael Kirchler, et al. 2018. “Evaluating the Replicability of Social Science Experiments in Nature and Science between 2010 and 2015.” Nature Human Behaviour 2(9):637–644.
- Nosek, Brian A., Charles R. Ebersole, Alexander C. DeHaven, and David T. Mellor. 2018. “The Preregistration Revolution.” Proceedings of the National Academy of Sciences 115(11):2600–2606.
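To make the idea of statistical power concrete, the sketch below (hypothetical code, not from the readings; sample sizes and the 0.3-SD effect are invented for illustration) estimates power by simulation for a two-arm design, showing how power grows with sample size:

```python
# Illustrative sketch: simulation-based power analysis for a two-arm experiment.
import math
import random
import statistics

random.seed(7)

def simulated_power(n_per_arm, effect, sd, reps=1000):
    """Fraction of simulated experiments whose two-sample z-test rejects H0 at alpha = 0.05."""
    rejections = 0
    for _ in range(reps):
        control = [random.gauss(0, sd) for _ in range(n_per_arm)]
        treated = [random.gauss(effect, sd) for _ in range(n_per_arm)]
        se = math.sqrt(statistics.variance(control) / n_per_arm
                       + statistics.variance(treated) / n_per_arm)
        z = (statistics.mean(treated) - statistics.mean(control)) / se
        if abs(z) > 1.96:  # two-sided test at the 5% level
            rejections += 1
    return rejections / reps

# For a fixed effect size (0.3 SD), power rises with sample size
low_n = simulated_power(n_per_arm=50, effect=0.3, sd=1.0)
high_n = simulated_power(n_per_arm=400, effect=0.3, sd=1.0)
print(low_n, high_n)  # the larger design should reject far more often
```

The same simulation logic extends to more realistic designs (blocking, covariate adjustment, attrition), which is why pre-registered power analyses are often done by simulation rather than closed-form formulas.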
NO CLASS March 14 (SPRING BREAK)
Week 9: Threats to Internal Validity (March 21)
Due (optional): Revised outline, now including a new section titled “Potential Threats.” In this section, diagnose threats to your experiment and briefly describe potential countermeasures. You may consider discussing false positives, statistical power, demand effects, noncompliance, spillover, and/or attrition.
- Mummolo, Jonathan, and Erik Peterson. 2019. “Demand Effects in Survey Experiments: An Empirical Assessment.” American Political Science Review 113(2):517–29. doi: 10.1017/S0003055418000837.
- Gerber, Alan S., and Donald P. Green. 2012. Field Experiments: Design, Analysis, and Interpretation. New York: W.W. Norton.
  - Ch. 5, “One-Sided Noncompliance”
  - Ch. 7, “Attrition”
  - Ch. 8, “Interference between Experimental Units”
Week 10: Ethics (March 28)
Due: Ethics Memo (see handout).
- Zimbardo, Philip G., et al. 1971. The Stanford Prison Experiment: Conducted August 1971 at Stanford University.
- McDermott, Rose, and Peter K. Hatemi. 2020. “Ethics in Field Experimentation: A Call to Establish New Standards to Protect the Public from Unwanted Manipulation and Real Harms.” Proceedings of the National Academy of Sciences 117(48):30014–30021.
- Matias, J. Nathan. 2016. “The Obligation to Experiment.” MIT Media Lab, December 12.
Week 11: Identifying and Interpreting Treatment Effects / Analysis of Data (April 4)
Due: Analysis memo (see handout).
- Chu, James, Guirong Li, Prashant Loyalka, Chengfang Liu, Leonardo Rosa, and Yanyan Li. 2020. “Stuck in Place? A Field Experiment on the Effects of Reputational Information on Student Evaluations.” Social Forces 98(4):1578–1612.
  - You can also skim the pre-analysis plan.
- CW Mutz, Diana. 2011. Population-Based Survey Experiments. Princeton: Princeton University Press.
  - Ch. 7, “Analysis of Population-Based Survey Experiments”
- Clifford, Scott, Geoffrey Sheagley, and Spencer Piston. 2021. “Increasing Precision without Altering Treatment Effects: Repeated Measures Designs in Survey Experiments.” American Political Science Review 1–18.
Week 12: Lightning Talks (April 11)
Presentations are scheduled before the final week so that you have time to integrate feedback into your proposal.
Week 13: Lightning Talks (April 18)
Presentations are scheduled before the final week so that you have time to integrate feedback into your proposal.
Week 14: New Advances and Critiques (April 25)
- Deaton, Angus, and Nancy Cartwright. 2018. “Understanding and Misunderstanding Randomized Controlled Trials.” Social Science & Medicine 210:2–21.
- Gelman, Andrew. 2018. “Benefits and Limitations of Randomized Controlled Trials: A Commentary on Deaton and Cartwright.” Social Science & Medicine 210:48–49.
- Al-Ubaydli, Omar, John A. List, and Dana L. Suskind. 2017. “What Can We Learn from Experiments? Understanding the Threats to the Scalability of Experimental Results.” American Economic Review: Papers and Proceedings 107(5):282–286.
- Paluck, Elizabeth Levy. 2010. “The Promising Integration of Qualitative Methods and Field Experiments.” The ANNALS of the American Academy of Political and Social Science 628(1):59–71.
- Wager, Stefan, and Susan Athey. 2018. “Estimation and Inference of Heterogeneous Treatment Effects Using Random Forests.” Journal of the American Statistical Association 113(523):1228–42. doi: 10.1080/01621459.2017.1319839.
- Levine, Adam Seth. 2020. “Research Impact Through Matchmaking (RITM): How and Why to Connect Researchers and Practitioners.” PS: Political Science & Politics 53:265–269.
Final Proposals Due (11:59pm on Sunday, May 5)