This study examines the performance of several two-stage procedures for testing ordinary least-squares (OLS) coefficients under heteroscedasticity. In the first stage, the usual homoscedasticity assumption is tested; based on that result, a test of the regression coefficients is chosen and performed in the second stage. Three recently developed methods for detecting heteroscedasticity are considered, along with three heteroscedasticity-robust tests of OLS coefficients. A major finding is that performing a preliminary test of heteroscedasticity before applying a heteroscedasticity-robust test can lead to poor control of the Type I error rate.
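To make the two-stage procedure concrete, the sketch below implements one hypothetical variant: a Breusch-Pagan-style LM test for heteroscedasticity in the first stage, followed by either a classical OLS t-test or an HC3 heteroscedasticity-robust test of a single coefficient in the second stage. The specific first-stage test, the HC3 estimator, and the function names are illustrative assumptions, not the particular methods compared in the study.

```python
import numpy as np
from scipy import stats

def two_stage_test(y, X, j=1, alpha=0.05):
    """Hypothetical two-stage test of coefficient beta_j.

    Stage 1: Breusch-Pagan-style LM test for heteroscedasticity.
    Stage 2: classical t-test if homoscedasticity is not rejected,
             otherwise an HC3 heteroscedasticity-robust z-test.
    """
    n, k = X.shape
    XtX_inv = np.linalg.inv(X.T @ X)
    beta = XtX_inv @ X.T @ y
    resid = y - X @ beta

    # Stage 1: regress squared residuals on X; LM = n * R^2 ~ chi2(k-1)
    u2 = resid ** 2
    fitted = X @ (XtX_inv @ X.T @ u2)
    r2 = 1 - np.sum((u2 - fitted) ** 2) / np.sum((u2 - u2.mean()) ** 2)
    bp_pval = stats.chi2.sf(n * r2, k - 1)

    if bp_pval >= alpha:
        # Stage 2a: classical OLS t-test of beta_j
        s2 = resid @ resid / (n - k)
        tstat = beta[j] / np.sqrt(s2 * XtX_inv[j, j])
        return "classical", 2 * stats.t.sf(abs(tstat), n - k)

    # Stage 2b: HC3 robust covariance
    # V = (X'X)^-1 X' diag(e_i^2 / (1 - h_i)^2) X (X'X)^-1
    h = np.einsum("ij,jk,ik->i", X, XtX_inv, X)  # leverages
    omega = (resid / (1 - h)) ** 2
    V = XtX_inv @ (X.T * omega) @ X @ XtX_inv
    zstat = beta[j] / np.sqrt(V[j, j])
    return "robust", 2 * stats.norm.sf(abs(zstat))

# Simulated example with multiplicative heteroscedasticity
rng = np.random.default_rng(0)
n = 200
x = rng.normal(size=n)
X = np.column_stack([np.ones(n), x])
y = 1.0 + 0.5 * x + rng.normal(size=n) * np.exp(0.5 * x)
branch, pval = two_stage_test(y, X)
```

The study's central point is that the `branch` chosen here is itself random, so conditioning the second-stage test on the first-stage outcome can distort the Type I error rate relative to always using the robust test.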