Preprint Article, Version 1 (this version is not peer-reviewed)

"Bring More Data!" – A Good Advice? Removing Separation in Logistic Regression by Increasing Sample Size

Version 1 : Received: 25 October 2019 / Approved: 28 October 2019 / Online: 28 October 2019 (12:01:17 CET)

How to cite: Šinkovec, H.; Geroldinger, A.; Heinze, G. "Bring More Data!" – A Good Advice? Removing Separation in Logistic Regression by Increasing Sample Size. Preprints 2019, 2019100321 (doi: 10.20944/preprints201910.0321.v1).

Abstract

The parameters of logistic regression models are usually obtained by the method of maximum likelihood (ML). However, in analyses of small data sets or of data sets with unbalanced outcomes or exposures, ML parameter estimates may not exist. This situation has been termed "separation", as the two outcome groups are separated by the values of a covariate or a linear combination of covariates. To overcome the problem of non-existing ML parameter estimates, applying Firth's correction (FC) was proposed. In practice, however, a principal investigator might be advised to "bring more data" in order to solve a separation issue. It is unclear whether such an increasing sample size (ISS) strategy, which keeps sampling new observations until separation is removed, improves estimation compared to applying FC to the original data set. We illustrate the problem with examples from colorectal cancer screening and ornithology. We performed an extensive simulation study whose main focus was to estimate the cost-adjusted relative efficiency of ML combined with ISS compared to FC. FC yielded reasonably small root mean squared errors and proved to be the more efficient estimator. Given our findings, we propose not to adapt the sample size when separation is encountered but to use FC as the default method of analysis whenever the number of observations or outcome events is critically low.
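The two phenomena the abstract contrasts can be reproduced on a tiny toy example. The following sketch is not the authors' simulation code: the perfectly separated data, step counts, and fitting routines are our own illustration. Under separation, plain ML estimation drives the slope coefficient toward infinity, while Firth's correction, implemented here via the modified score U*(β) = Xᵀ(y − p + h(½ − p)) with hat-matrix diagonal h (the standard form given by Heinze & Schemper, 2002), returns a finite, stable estimate.

```python
import numpy as np

def logistic(z):
    return 1.0 / (1.0 + np.exp(-z))

def ml_gradient_ascent(X, y, steps, lr=0.5):
    """Plain ML fit by gradient ascent on the logistic log-likelihood."""
    beta = np.zeros(X.shape[1])
    for _ in range(steps):
        p = logistic(X @ beta)
        beta += lr * X.T @ (y - p)  # score vector of the log-likelihood
    return beta

def firth_fit(X, y, steps=50):
    """Firth's correction: Newton steps on the Jeffreys-penalized
    (modified) score U*(b) = X'(y - p + h*(1/2 - p))."""
    beta = np.zeros(X.shape[1])
    for _ in range(steps):
        p = logistic(X @ beta)
        W = np.diag(p * (1.0 - p))
        info = X.T @ W @ X                      # Fisher information
        H = np.sqrt(W) @ X @ np.linalg.inv(info) @ X.T @ np.sqrt(W)
        h = np.diag(H)                          # leverages (hat-matrix diagonal)
        score = X.T @ (y - p + h * (0.5 - p))   # Firth-modified score
        beta += np.linalg.solve(info, score)
    return beta

# Perfectly separated toy data: y = 0 whenever x <= 1, y = 1 whenever x >= 2.
X = np.column_stack([np.ones(4), [0.0, 1.0, 2.0, 3.0]])
y = np.array([0.0, 0.0, 1.0, 1.0])

# ML: the slope estimate keeps growing with the number of iterations --
# the likelihood is maximized only as the slope goes to infinity.
ml_100 = ml_gradient_ascent(X, y, steps=100)
ml_10k = ml_gradient_ascent(X, y, steps=10_000)

# FC: the penalized estimate is finite and stable.
fc = firth_fit(X, y)
```

The divergence of the ML slope (`ml_10k[1]` exceeds `ml_100[1]`, and both keep growing with further iterations) is exactly the non-existence of the ML estimate under separation; the FC estimate converges after a handful of Newton steps.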

Subject Areas

maximum likelihood; logistic regression; Firth's correction; separation; penalized likelihood; bias
