Logistic regression: Analyzing binary outcomes
March 25, 2019 @ 12:00 pm - 4:00 pm · $175
A 4-hour workshop taught by Stephen R. Porter, Ph.D.
Many outcomes in education are binary in nature: accept or decline an offer of admission, pass or fail a course, persist to another year or stop out. Logistic regression, rather than multiple regression, is the standard approach to analyzing discrete outcomes. This workshop will train participants in applying logistic regression to their research, focusing on 1) the parallels with multiple regression, and 2) how to interpret model results for a wide audience.
By the end of the workshop, participants should understand logistic regression well enough to begin using it in their research. They will understand when to use logistic regression, how to interpret logistic regression coefficients, and how to calculate and discuss model fit.
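As a preview of the kind of interpretation the workshop covers, here is a minimal Python sketch (the workshop itself demonstrates in Stata) of how a logistic model converts a linear predictor into a probability. The intercept and GPA coefficient below are hypothetical, chosen only for illustration:

```python
import math

def predicted_probability(intercept, coefs, x):
    """Inverse logit: p = 1 / (1 + exp(-(b0 + b1*x1 + ...)))."""
    z = intercept + sum(b * xi for b, xi in zip(coefs, x))
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical model of accepting an admission offer as a function of GPA
b0, b_gpa = -4.0, 1.5

p_low = predicted_probability(b0, [b_gpa], [2.0])   # GPA of 2.0
p_high = predicted_probability(b0, [b_gpa], [3.0])  # GPA of 3.0
```

Note that, unlike Y-hat from multiple regression, the same one-unit change in GPA shifts the predicted probability by different amounts depending on where you start on the curve. This is the core interpretation issue the workshop addresses.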
Pricing and schedule
Time: Monday, March 25, 12PM to 4PM (EST)
We offer a $25 graduate student discount as well as multiple-workshop discounts. Find out about our discounts here.
Time permitting, Dr. Porter will also answer questions about participants’ specific research projects. Participants can ask questions via chat, microphone, or telephone. In order to allow sufficient time for questions, the number of workshop participants is limited to 30.
Who should attend?
The target audience is researchers who are familiar with multiple regression but not with logistic regression, and who wish to begin using it in their research (or researchers looking for a quick refresher). This is an applied course, so no advanced math skills are required; however, you should understand how to interpret a multiple regression coefficient. Software demonstrations will use Stata, but output from SAS and SPSS will be included and reviewed so that participants can understand and interpret logistic regression models estimated with those packages.
If you are interested in propensity score analysis, this is an excellent workshop to attend prior to our matching workshop.
Topics covered
- Why logistic regression is preferred over multiple regression (the linear probability model)
- How logistic regression estimates coefficients (maximum likelihood), and the problem this poses for interpretation
- Similarities with multiple regression: most of what you know can be applied directly to logistic regression
- Predicted probabilities versus Y-hat from multiple regression
- Interpreting results using odds ratios: what they are and why you don’t want to use them
- Interpreting results using discrete changes in probability (delta-p statistic)
- Different ways of measuring model fit (pseudo R-squared, percent correctly predicted)
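To make the last few bullets concrete, here is a small Python sketch (again with made-up coefficients, not workshop code) contrasting the odds ratio with the delta-p statistic, and computing percent correctly predicted on toy data:

```python
import math

def prob(z):
    """Inverse logit: convert linear predictor z to a probability."""
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical fitted model with one binary predictor x:
#   logit(p) = b0 + b1 * x
b0, b1 = -1.0, 0.8

# Odds ratio for x: exp(b1). It is constant across cases, but it
# describes a multiplicative change in the odds, which audiences
# routinely misread as a change in probability.
odds_ratio = math.exp(b1)

# Delta-p: the change in predicted probability when x moves 0 -> 1.
# Directly answers "how much more likely is the outcome?"
delta_p = prob(b0 + b1) - prob(b0)

# Percent correctly predicted: classify p >= 0.5 as 1 and compare
# to the observed outcomes (toy data).
xs, ys = [0, 0, 1, 1], [0, 1, 0, 1]
preds = [1 if prob(b0 + b1 * x) >= 0.5 else 0 for x in xs]
pcp = sum(p == y for p, y in zip(preds, ys)) / len(ys)
```

In this toy model the odds of the outcome are about 2.2 times higher when x = 1, but the probability rises by only about 0.18, which is why reporting discrete changes in probability is usually clearer for a wide audience.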