# RKL: a general, invariant Bayes solution for Neyman-Scott

@article{Brand2017RKLAG,
  title   = {RKL: a general, invariant Bayes solution for Neyman-Scott},
  author  = {M. Brand},
  journal = {arXiv: Machine Learning},
  year    = {2017}
}

Neyman-Scott is a classic example of an estimation problem with a partially-consistent posterior, for which standard estimation methods tend to produce inconsistent results. Past attempts to create consistent estimators for Neyman-Scott have led to ad-hoc solutions, to estimators that do not satisfy representation invariance, to restrictions over the choice of prior and more. We present a simple construction for a general-purpose Bayes estimator, invariant to representation, which satisfies…

#### References

Showing 1-10 of 28 references

On Some Bayesian Solutions of the Neyman-Scott Problem

- Mathematics
- 1994

One of the two celebrated examples of Neyman and Scott (1948) is that in a fixed effects one-way analysis of variance model with normal homoscedastic errors, the maximum likelihood estimator of the…
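The inconsistency alluded to above is easy to see numerically: with a fixed number of replications per stratum and a growing number of strata, the joint MLE of the common variance converges to a value strictly below the truth (for two replications per stratum, to half of it). A minimal simulation sketch (the setup and variable names are illustrative, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
m = 200_000      # number of strata; this is what grows in Neyman-Scott asymptotics
sigma2 = 4.0     # true common error variance

# Each stratum i has its own nuisance mean mu_i and exactly 2 replications.
mu = rng.normal(0.0, 10.0, size=m)
x = rng.normal(loc=mu[:, None], scale=np.sqrt(sigma2), size=(m, 2))

# Profiling out each mu_i with the stratum sample mean gives the joint MLE
#   sigma2_hat = (1 / (2m)) * sum_i sum_j (x_ij - xbar_i)^2
xbar = x.mean(axis=1, keepdims=True)
sigma2_mle = np.sum((x - xbar) ** 2) / (2 * m)

# Converges to sigma2 / 2 = 2.0, not the true 4.0: the MLE is inconsistent
# because the number of nuisance parameters grows with the sample size.
print(sigma2_mle)
```

Doubling the MLE restores consistency here, but such a correction is exactly the kind of ad-hoc, model-specific fix the abstract above is contrasting with a general construction.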

Noninformative priors for the two sample normal problem

- Mathematics
- 1996

Summary: The paper considers two examples in the two sample normal problem and finds noninformative priors which satisfy (i) a criterion of matching asymptotically the posterior distribution function of…

Efficiency of projected score methods in rectangular array asymptotics

- Mathematics
- 2003

The paper considers a rectangular array asymptotic embedding for multistratum data sets, in which both the number of strata and the number of within-stratum replications increase, and at the same…

An invariant form for the prior probability in estimation problems

- Mathematics, Medicine
- Proceedings of the Royal Society of London. Series A. Mathematical and Physical Sciences
- 1946

It is shown that a certain differential form depending on the values of the parameters in a law of chance is invariant for all transformations of the parameters when the law is differentiable with…
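The invariant differential form described here is what is now known as the Jeffreys prior. As a reminder of why it is representation-invariant (a standard fact, not taken from this page):

```latex
% Jeffreys prior: square root of the Fisher information determinant
\pi(\theta) \;\propto\; \sqrt{\det I(\theta)},
\qquad
I(\theta)_{ij} \;=\; \mathbb{E}_{x \sim p(\cdot \mid \theta)}
  \bigl[\,\partial_{\theta_i}\log p(x\mid\theta)\,
         \partial_{\theta_j}\log p(x\mid\theta)\,\bigr].

% Under a smooth reparameterisation \phi = h(\theta) with Jacobian
% J = \partial\theta / \partial\phi, Fisher information transforms as
% I(\phi) = J^{\top} I(\theta)\, J, hence
\sqrt{\det I(\phi)} \;=\; \sqrt{\det I(\theta)}\;\lvert \det J \rvert,
% which is exactly the change-of-variables factor for a density:
% the prior is the same regardless of parameterisation.
```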

A Review of Consistency and Convergence of Posterior Distribution

In this article, we review two important issues, namely consistency and convergence of posterior distribution, that arise in Bayesian inference with large samples. Both parametric and non-parametric…

A Catalog of Noninformative Priors

- Computer Science
- 1996

A catalog of many of the resulting priors is provided and known properties of the priors are listed; emphasis is given to reference priors and the Jeffreys prior, although other approaches are also considered.

On Divergences and Informations in Statistics and Information Theory

- Mathematics, Computer Science
- IEEE Transactions on Information Theory
- 2006

The paper deals with the f-divergences of Csiszár generalizing the discrimination information of Kullback, the total variation distance, the Hellinger divergence, and the Pearson divergence. All…
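For orientation, the family discussed in this reference can be summarised as follows (standard definitions, not quoted from the paper itself):

```latex
% f-divergence of Csiszár, for convex f with f(1) = 0:
D_f(P \,\|\, Q) \;=\; \int f\!\left(\frac{dP}{dQ}\right) dQ.

% The divergences named above are particular choices of f:
%   Kullback--Leibler:  f(t) = t \log t
%   Total variation:    f(t) = \tfrac{1}{2}\,\lvert t - 1 \rvert
%   Squared Hellinger:  f(t) = (\sqrt{t} - 1)^2
%   Pearson \chi^2:     f(t) = (t - 1)^2
```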

The Estimation of Distributions and the Minimum Relative Entropy Principle

- Mathematics, Computer Science
- Evol. Comput.
- 2005

The relationship of EDA to algorithms developed in statistics, artificial intelligence, and statistical physics is explained within a general interdisciplinary framework and it is shown that maximum entropy approximations play a crucial role.

Semilinear High-Dimensional Model for Normalization of Microarray Data

- Mathematics
- 2005

Normalization of microarray data is essential for removing experimental biases and revealing meaningful biological results. Motivated by a problem of normalizing microarray data, a semilinear…

Point Estimation Using the Kullback-Leibler Loss Function and MML

- Mathematics, Computer Science
- PAKDD
- 1998

An argument is presented as to why the SMML and MML estimators are invariant under parameter transformations, and an approximation to SMML called Fairly Strict MML (FSMML) maps regions from the parameter space to point estimates.