Open Research Newcastle

Convergence of best entropy estimates

journal contribution
posted on 2025-05-09, 07:56 authored by J. M. Borwein, A. S. Lewis
Given a finite number of moments of an unknown density x̄ on a finite measure space, the best entropy estimate (that nonnegative density x with the given moments which minimizes the Boltzmann–Shannon entropy I(x) := ∫ x log x) is considered. A direct proof is given that I has the Kadec property in L1: if yₙ converges weakly to ȳ and I(yₙ) converges to I(ȳ), then yₙ converges to ȳ in norm. As a corollary, it is obtained that, as the number of given moments increases, the best entropy estimates converge in L1 norm to the best entropy estimate of the limiting problem, which is simply x̄ in the determined case. Furthermore, for classical moment problems on intervals with x̄ strictly positive and sufficiently smooth, error bounds and uniform convergence are actually obtained.
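The setting the abstract describes can be illustrated numerically. The sketch below (an assumption of this summary, not code from the paper) computes a best entropy estimate on [0, 1] from a few polynomial moments by solving the convex dual problem: the minimizer of ∫ x log x subject to ∫ tᵏ x(t) dt = μₖ has the exponential form x(t) = exp(Σₖ λₖ tᵏ − 1), and the multipliers λ are found by unconstrained minimization. All function names and the quadrature grid are illustrative.

```python
# Hedged sketch of best entropy estimation from moments, assuming the
# classical moment problem on [0, 1] with monomial moment functions.
import numpy as np
from scipy.optimize import minimize

def _integrate(f, dt):
    # Trapezoidal rule over the last axis of f on a uniform grid.
    return dt * (f.sum(axis=-1) - 0.5 * (f[..., 0] + f[..., -1]))

def best_entropy_estimate(moments, t):
    """Minimize I(x) = integral of x log x subject to
    integral of t**k * x(t) dt = moments[k], k = 0..m-1.

    The minimizer has the form x(t) = exp(sum_k lam_k t**k - 1);
    lam is found by minimizing the smooth convex dual
        D(lam) = integral of exp(sum_k lam_k t**k - 1) dt
                 - sum_k lam_k * moments[k].
    """
    mu = np.asarray(moments, dtype=float)
    dt = t[1] - t[0]
    Phi = np.vstack([t**k for k in range(len(mu))])  # basis on the grid

    def dual(lam):
        x = np.exp(lam @ Phi - 1.0)
        return _integrate(x, dt) - lam @ mu

    def grad(lam):
        x = np.exp(lam @ Phi - 1.0)
        return _integrate(Phi * x, dt) - mu

    res = minimize(dual, np.zeros(len(mu)), jac=grad, method="BFGS")
    return np.exp(res.x @ Phi - 1.0)

# Moments of the uniform density on [0, 1]: mu_k = 1/(k+1).
t = np.linspace(0.0, 1.0, 2001)
x = best_entropy_estimate([1.0, 1.0 / 2.0, 1.0 / 3.0], t)
# In this determined-by-construction example, the estimate should
# recover the uniform density x(t) close to 1 on the whole interval.
```

The paper's convergence result says that as more moments are supplied, such estimates converge in L1 norm to the estimate of the limiting problem; this toy example only exhibits the construction for a fixed, small number of moments.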

History

Journal title

SIAM Journal on Optimization

Volume

1

Issue

2

Pagination

191-205

Publisher

Society for Industrial and Applied Mathematics (SIAM)

Language

English

College/Research Centre

Faculty of Science and Information Technology

School

School of Mathematical and Physical Sciences
