Extending Bayesian induction

Suyog H. Chandramouli, Richard M. Shiffrin

Research output: Contribution to journal › Article › peer-review

8 Scopus citations

Abstract

This article comments on "Harold Jeffreys's Default Bayes Factor Hypothesis Tests: Explanation, Extension, and Application in Psychology" by Ly, Verhagen, and Wagenmakers (2016). Their article is an excellent summary of Harold Jeffreys's seminal contributions to Bayesian induction. We comment on a method of extending Bayesian induction that places the emphasis on data rather than models. Models are always wrong: they act as approximations to the data from which they derive, and thereby explain some of the main factors operating in the experiment. Our simple extension places priors and posteriors on the possible distributions of data outcomes, one of which represents the true state of the world; the observed data are a sample from that unknown true state. The proposed system infers the probability that a given data distribution is the true one, based on the observed sample of data; the posterior probabilities that given model instances provide the best approximations to the truth can then be obtained directly.
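The inference the abstract describes can be sketched numerically: place a prior over a set of candidate data distributions, update it with a multinomial likelihood for the observed sample, then credit each model instance with the posterior mass of every candidate truth it approximates best. The sketch below is illustrative only; the outcome grid, the sample counts, the two model instances, and the use of KL divergence as the measure of approximation are all assumptions, not details taken from the article.

```python
import numpy as np

# Hypothetical world with 3 outcomes; a coarse grid of candidate "true"
# data distributions (the grid itself is an illustrative assumption).
candidates = np.array([
    [0.6, 0.3, 0.1],
    [0.4, 0.4, 0.2],
    [0.2, 0.5, 0.3],
    [0.1, 0.3, 0.6],
])
prior = np.full(len(candidates), 1.0 / len(candidates))  # uniform prior

# Observed sample: counts per outcome (made-up data).
counts = np.array([12, 9, 4])

# Multinomial log-likelihood up to a constant: sum_k n_k * log p_k.
log_lik = counts @ np.log(candidates).T

# Posterior over candidate data distributions (normalized for stability).
post = prior * np.exp(log_lik - log_lik.max())
post /= post.sum()

# Two model instances, each predicting a data distribution (illustrative).
instances = np.array([
    [0.50, 0.35, 0.15],  # instance A
    [0.15, 0.35, 0.50],  # instance B
])

# KL divergence from each candidate truth p to each instance's prediction q;
# an instance "best approximates" a truth when its KL is smallest.
kl = np.array([[np.sum(p * np.log(p / q)) for q in instances]
               for p in candidates])
best = kl.argmin(axis=1)

# Posterior probability that each instance best approximates the truth:
# sum the posterior mass of the candidate truths assigned to it.
p_best = np.array([post[best == j].sum() for j in range(len(instances))])
print(p_best)
```

Because the observed counts favor candidates that put most probability on the first outcome, the posterior mass concentrates on truths closest to instance A, so `p_best` assigns A the larger share; the two entries always sum to one since every candidate truth is assigned to exactly one instance.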

Original language: English (US)
Pages (from-to): 38-42
Number of pages: 5
Journal: Journal of Mathematical Psychology
Volume: 72
DOIs
State: Published - Jun 1 2016
Externally published: Yes

All Science Journal Classification (ASJC) codes

  • General Psychology
  • Applied Mathematics

Keywords

  • Bayes Factor
  • BMS*
  • Data distributions
  • Model Selection

