
Author: Segert, Simon; Davis-Stober, Clintin P
Title: A General Approach to Prior Transformation.
  • Cord-id: j35z7qek
  • Document date: 2019-08-01
  • ID: j35z7qek
    Document: We present a general method for setting prior distributions in Bayesian models where parameters of interest are re-parameterized via a functional relationship. We generalize the results of Heck and Wagenmakers (2016) by considering the case where the dimension of the auxiliary parameter space does not equal that of the primary parameter space. We present numerical methods for carrying out prior specification for statistical models that do not admit closed-form solutions. Taken together, these results provide researchers a more complete set of tools for setting prior distributions that could be applied to many cognitive and decision making models. We illustrate our approach by reanalyzing data under the Selective Integration model of Tsetsos et al. (2016). We find, via a Bayes factor analysis, that the selective integration model with all four parameters generally outperforms both the three-parameter variant (omitting early cognitive noise) and the w = 1 variant (omitting selective gating), as well as an unconstrained competitor model. By contrast, Tsetsos et al. found the three parameter variant to be the best performing in a BIC analysis (in the absence of a competitor). Finally, we also include a pedagogical treatment of the mathematical tools necessary to formulate our results, including a simple "toy" example that illustrates our more general points.
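The core idea the abstract describes, approximating the prior induced on primary parameters by a functional reparameterization when no closed form exists, can be sketched with a simple Monte Carlo push-forward. The map below, f(a, b) = a / (a + b), is purely illustrative (it is not the Selective Integration model or any function from the paper): auxiliary draws are pushed through f to approximate the implied prior on the primary parameter.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical reparameterization: two auxiliary parameters (a, b)
# map to a single primary parameter theta in (0, 1). Note the auxiliary
# space here has higher dimension than the primary space, the case the
# paper generalizes beyond Heck and Wagenmakers (2016).
def f(a, b):
    return a / (a + b)

# Prior on the auxiliary space: independent Gamma(2, 1) draws.
n = 100_000
a = rng.gamma(2.0, 1.0, size=n)
b = rng.gamma(2.0, 1.0, size=n)

# Monte Carlo approximation of the push-forward (induced) prior on theta.
theta = f(a, b)

# With independent Gamma(2, 1) components, theta is Beta(2, 2) distributed,
# so the sample mean should be close to 0.5.
print(theta.mean())
```

The induced samples can then be inspected (histogram, moments) to check whether the implied prior on the primary parameter matches the analyst's intent before fitting the model.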
