As a supplement to summary statistics such as information criteria, the closeness of two or more competing non-nested models can be compared under a procedure more general than that proposed in Vuong (1989): measures of closeness other than the Kullback-Leibler divergence are allowed. Large deviation theory is used to obtain a bound on the power of rejecting the null hypothesis that the two models are equally close to the true model. This bound can be expressed in terms of a constant \gamma \in [0, 1), which can be computed empirically without any knowledge of the data-generating mechanism. Additionally, based on the constant \gamma, procedures constructed from different measures of closeness can be compared on their ability to detect a difference between two models.
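For context, the classical procedure that this work generalizes can be sketched as follows. This is a minimal illustration of the Vuong (1989) test with the Kullback-Leibler-based closeness measure, not the more general procedure of the paper; the data and the two candidate models (exponential vs. log-normal) are hypothetical choices for the example.

```python
# Sketch of the classical Vuong (1989) non-nested model comparison.
# H0: the two candidate models are equally close (in KL divergence)
# to the true data-generating distribution.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
x = rng.lognormal(mean=0.0, sigma=0.7, size=500)  # hypothetical data

# Maximum-likelihood fits of the two non-nested candidates.
lam = 1.0 / x.mean()                                  # exponential MLE
mu, sigma = np.log(x).mean(), np.log(x).std(ddof=0)   # log-normal MLE

ll_exp = stats.expon.logpdf(x, scale=1.0 / lam)
ll_lgn = stats.lognorm.logpdf(x, s=sigma, scale=np.exp(mu))

# Vuong statistic: standardized mean of pointwise log-likelihood
# differences; asymptotically standard normal under H0.
d = ll_lgn - ll_exp
z = np.sqrt(len(d)) * d.mean() / d.std(ddof=0)
p = 2 * stats.norm.sf(abs(z))
print(f"z = {z:.2f}, p = {p:.3g}")
```

A large |z| rejects the hypothesis of equal closeness; the paper's contribution is to allow closeness measures other than KL divergence in this comparison and to bound the power of the resulting test via the constant \gamma.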