
Please use this identifier to cite or link to this item:
https://dspace.lboro.ac.uk/2134/24650

Title:  A new Bayesian test to test for the intractability-countering hypothesis 
Authors:  Chakrabarty, Dalia 
Keywords:  Bayes factors; Hypothesis testing; Markov Chain Monte Carlo; Bayesian P-values 
Issue Date:  2016 
Publisher:  Taylor & Francis © American Statistical Association 
Citation:  CHAKRABARTY, D., 2016. A new Bayesian test to test for the intractability-countering hypothesis. Journal of the American Statistical Association, 112 (518), pp. 561-577. 
Abstract:  We present a new test of hypothesis in which we seek the probability of the null conditioned on the data, where the null is a simplification undertaken to counter the intractability of the more complex model within which the simpler null model is nested. With the more complex model rendered intractable, the null model uses a simplifying assumption that enables the learning of an unknown parameter vector given the data. Bayes factors are shown to be known only up to a ratio of unknown data-dependent constants, a problem that cannot be cured using prescriptions similar to those suggested to solve the problem that noninformative priors cause for Bayes factor computation. Thus, a new test is needed in which Bayes factor computation can be circumvented. In this test, we generate data from the model in which the null hypothesis is true, and can measure support in the measured data for the null by comparing the marginalised posterior of the model parameter given the measured data to that given such generated data. However, such a ratio of marginalised posteriors can confound the comparison of support in one measured data set for one null with support in another data set for a different null. Given an application in which such comparison is undertaken, we alternatively define support in a measured data set for a null by identifying the model parameter values that are less consistent with the measured data than is minimally possible given the generated data, realising that the higher the number of such parameter values, the less the support in the measured data for the null. Then, within an MCMC-based scheme, the probability of the null conditional on the data is given by marginalising the posterior given the measured data over parameter values that are as consistent with the measured data, or more consistent, than with the generated data. 
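The final step of the scheme described above can be mimicked in a toy Monte Carlo sketch: given posterior samples of a parameter under the measured data and under data generated with the null true, approximate the probability of the null as the measured-data posterior mass on parameter values at least as consistent with the data as is minimally achieved under the generated data. This is only an illustrative sketch, not the paper's implementation; the distributions, the bandwidth, and the use of a kernel density estimate as a consistency proxy are all hypothetical choices:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-ins: posterior samples of a scalar parameter theta
# given the measured data, and given data generated under the null.
theta_measured = rng.normal(0.0, 1.0, size=5000)   # posterior | measured data
theta_generated = rng.normal(0.2, 1.0, size=5000)  # posterior | generated data


def density_at(samples, points, bandwidth=0.2):
    """Gaussian kernel density estimate at `points` from `samples`.

    Used here only as a rough proxy for how consistent a parameter
    value is with the data that produced the posterior samples.
    """
    diffs = (points[:, None] - samples[None, :]) / bandwidth
    return np.exp(-0.5 * diffs**2).mean(axis=1) / (bandwidth * np.sqrt(2 * np.pi))


# Minimal consistency attained by any sampled parameter value
# under the generated (null-true) data.
min_gen = density_at(theta_generated, theta_generated).min()

# Probability of the null: posterior mass, given the measured data,
# on parameter values at least as consistent as that minimum.
dens_measured = density_at(theta_measured, theta_measured)
p_null = float(np.mean(dens_measured >= min_gen))
```

In practice the marginalisation would run over MCMC samples of the full parameter vector rather than a scalar, but the counting logic, mass on parameter values no less consistent than the worst case under the generated data, is the same.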
In the aforementioned application, we test the hypothesis that a galactic state space bears an isotropic geometry, where the (missing) data, comprising measurements of some components of the state space vectors of a sample of observed galactic particles, are used to Bayesianly learn the gravitational mass density of all matter in the galaxy. Without the assumption that the state space is isotropic, the likelihood of the sought gravitational mass density given the data is intractable. For a real example galaxy, we find unequal values of the probability of the null (that the host state space is isotropic) given two different data sets, implying that in this galaxy the system state space comprises at least two disjoint subvolumes in which the two data sets respectively live. Implementation on simulated galactic data is also undertaken, as is an empirical illustration on the well-known O-ring data, to test for the form of the thermal variation of the failure probability of the O-rings. 
Description:  This is an Accepted Manuscript of an article published by Taylor & Francis in Journal of the American Statistical Association on 07 Oct 2016, available online: http://dx.doi.org/10.1080/01621459.2016.1240684 
Version:  Accepted for publication 
DOI:  10.1080/01621459.2016.1240684 
URI:  https://dspace.lboro.ac.uk/2134/24650 
Publisher Link:  http://dx.doi.org/10.1080/01621459.2016.1240684 
ISSN:  0162-1459 
Appears in Collections:  Published Articles (Maths)

