Uniwersytet Warszawski - Central Authentication System

Experimental design in Social Research

General information

Course code: 2500-PL-PS-SP15-07
Erasmus / ISCED code: 14.4 / (0313) Psychology
Course name: Experimental design in Social Research
Unit: Wydział Psychologii (Faculty of Psychology)
Groups:
ECTS credits and other: 2.00
Language of instruction: English
Prerequisites (descriptive):

Course for specialization 315

Short description: (in English only)

The aim of the course is to expand students' knowledge of the theory and practice of planning experiments in social psychology and beyond. Students will learn how to design adequately powered and valid experiments, and about methodological aspects of experimental design such as participant sampling, stimulus sampling, power analysis, and accounting for publication bias when interpreting effect sizes. Students will also discuss recent methodological innovations in the social sciences and how to apply them in their own research.

Learning outcomes: (in English only)

- Develop skills to critically assess the size of effects reported in experimental research.

- Learn how to conduct power analysis and how to select an adequate sample size in experimental studies.

- Deepen knowledge of factors that can increase or reduce the power of an experimental design.

- Become acquainted with different methods of improving the validity of experiments in social psychology.

- Learn how to clearly present the results of experiments in social psychology.

- Develop skills to design high-quality experimental research that is both interesting and reliable.

Classes in the "Winter semester 2023/24" term (completed)

Period: 2023-10-01 - 2024-01-28
Type of classes:
Classes, 15 hours
Coordinators: (no data)
Group instructors: Wiktor Soral
Assessment: Course - graded credit
Classes - graded credit
Full description: (in English only)

The aim of the course is to expand students' knowledge of the theory and practice of planning experiments in social psychology and beyond. Students will learn how to design adequately powered and valid experiments, and about methodological aspects of experimental design such as participant sampling, stimulus sampling, power analysis, and accounting for publication bias when interpreting effect sizes. Students will also discuss recent methodological innovations in the social sciences and how to apply them in their own research.

After the course, students should be comfortable with selecting an appropriate effect size for their planned research and, based on it, an adequate sample size for their experiment. They should know various ways of improving the statistical power of their experiments, as well as the potential limitations of these methods. Furthermore, students should better understand threats to the validity of their research and how to avoid them. They will also become acquainted with recent methodological topics in psychology, such as new ways of visualizing experimental results.
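To make the effect-size-to-sample-size step concrete, below is a minimal sketch of an a priori power analysis for a two-group between-subjects design, assuming Python with the statsmodels package is available; the effect size (Cohen's d = 0.4), alpha, and target power are illustrative placeholders, not values prescribed by the course.

    # A priori power-analysis sketch (illustrative numbers, not course-mandated)
    import math
    from statsmodels.stats.power import TTestIndPower

    effect_size = 0.4   # assumed smallest effect of interest (Cohen's d), ideally bias-adjusted
    alpha = 0.05        # two-sided significance level
    power = 0.80        # target statistical power

    # Solve for the required sample size per group in an independent-samples t-test
    n_per_group = TTestIndPower().solve_power(
        effect_size=effect_size, alpha=alpha, power=power,
        ratio=1.0, alternative='two-sided'
    )

    print(f"Participants needed per group: {math.ceil(n_per_group)}")  # about 100 for d = 0.4

The same logic generalizes to more complex designs (for example via simulation, or tools such as PANGEA from the reading list), where power additionally depends on the number of stimuli and the reliability of the measures.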

Literature: (in English only)

1. Hello. Do we need experiments to prove causality? Basics of experimental design – recap.

• No readings

2. Effect sizes and planning sample sizes

• Funder, D. C., & Ozer, D. J. (2019). Evaluating effect size in psychological research: Sense and nonsense. Advances in Methods and Practices in Psychological Science, 2(2), 156-168.

• Westfall, J. (2015). PANGEA: Power analysis for general ANOVA designs. Unpublished manuscript. Available at https://jakewestfall.org/publications/pangea.pdf

• Anderson, S. F., Kelley, K., & Maxwell, S. E. (2017). Sample-size planning for more accurate statistical power: A method adjusting sample effect sizes for publication bias and uncertainty. Psychological Science, 28(11), 1547-1562.

3. Various approaches to improving power of experiments

• Kanyongo, G. Y., Brook, G. P., Kyei-Blankson, L., & Gocmen, G. (2007). Reliability and statistical power: How measurement fallibility affects power and required sample sizes for several parametric and nonparametric statistics. Journal of Modern Applied Statistical Methods, 6(1), 81-90.

• Charness, G., Gneezy, U., & Kuhn, M. A. (2012). Experimental methods: Between-subject and within-subject design. Journal of Economic Behavior & Organization, 81(1), 1-8.

• Baker, D. H., Vilidaite, G., Lygo, F. A., Smith, A. K., Flack, T. R., Gouws, A. D., & Andrews, T. J. (2021). Power contours: Optimising sample size and precision in experimental psychology and human neuroscience. Psychological Methods, 26(3), 295–314.

4. Improving validity of experiments

• Kenny, D. A. (2019). Enhancing validity in psychological research. American Psychologist, 74(9), 1018–1028.

• Henrich, J., Heine, S. J., & Norenzayan, A. (2010). The weirdest people in the world? Behavioral and Brain Sciences, 33(2-3), 61-83. (without commentaries)

• Wells, G. L., & Windschitl, P. D. (1999). Stimulus sampling and social psychological experimentation. Personality and Social Psychology Bulletin, 25, 1115–1125.

• Baumeister, R. F., Vohs, K. D., & Funder, D. C. (2007). Psychology as the science of self-reports and finger movements: Whatever happened to actual behavior? Perspectives on Psychological Science, 2(4), 396-403.

5. Doing research that is both interesting and reliable. Guidelines for better presentation of results.

• Gray, K., & Wegner, D. M. (2013). Six guidelines for interesting research. Perspectives on Psychological Science, 8(5), 549-553.

• Fiedler, K. (2017). What constitutes strong psychological science? The (neglected) role of diagnosticity and a priori theorizing. Perspectives on Psychological Science, 12(1), 46-61.

• Hehman, E., & Xie, S. Y. (2021). Doing better data visualization. Advances in Methods and Practices in Psychological Science, 4(4), 25152459211045334.

Classes in the "Winter semester 2024/25" term (not yet started)

Period: 2024-10-01 - 2025-01-26
Type of classes:
Classes, 15 hours
Coordinators: (no data)
Group instructors: Wiktor Soral
Assessment: Course - graded credit
Classes - graded credit
Full description and literature: (in English only) identical to those listed for the "Winter semester 2023/24" cycle above.

Course descriptions in USOS and USOSweb are protected by copyright.
The copyright holder is Uniwersytet Warszawski (University of Warsaw).