David Dowe

On July 18, 2012, in Presenters, Summit, by Adam Ford


In 1997, David Dowe presented work on the relationship between (algorithmic) information theory and the inductive inference part of intelligence, preceding similar but independent work by Jose Hernandez-Orallo later in the 1990s.

In the posthumous (2005) book “Statistical and Inductive Inference by Minimum Message Length” by the originator of the Minimum Message Length (MML) principle, Chris Wallace (1933-2004),
(a) David Dowe is given special mention in the preface on page vi,
(b) David Dowe is the most mentioned living person in the table of contents, where his name appears twice,
(c) David Dowe is the living person whose name and work are most mentioned in the index,
(d) other than Chris Wallace himself, David Dowe is the most cited author in the reference list.

Building on Dowe & Wallace (1998), Comley & Dowe (2003, 2005) are the first two papers on MML Bayesian nets that can deal with both discrete (multi-valued) and continuous-valued attributes.
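For readers new to MML, here is a minimal sketch of the two-part coding idea underlying all of this work: a hypothesis is charged for the bits needed to state it, plus the bits needed to encode the data assuming it. This is a deliberately crude illustration (a simple grid over the parameter, with invented coin-flip data), not the Wallace-Freeman construction the papers above actually use.

```python
import math

# A crude two-part MML sketch: to transmit n coin flips, first state a
# hypothesis (a bias p chosen from a discrete grid), then encode the
# data under that hypothesis. The best hypothesis minimises the total.
def best_two_part_message(data, grid_size=32):
    n, k = len(data), sum(data)
    best = None
    for i in range(1, grid_size):
        p = i / grid_size
        assertion = math.log2(grid_size)  # bits to state p
        detail = -(k * math.log2(p) + (n - k) * math.log2(1 - p))  # bits for data
        total = assertion + detail
        if best is None or total < best[0]:
            best = (total, p)
    return best  # (total message length in bits, chosen p)

flips = [1, 1, 0, 1, 1, 1, 0, 1, 1, 1]
print(best_two_part_message(flips))
```

A model that fits well but is expensive to state loses to a simpler one; minimising the total message length is what gives MML its built-in trade-off between model complexity and goodness of fit.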

Dowe has a recently released 82-page chapter on MML in Elsevier’s (2011) Handbook of the Philosophy of Science – (HPS Volume 7) Philosophy of Statistics, following his 2007 piece in Brit. J. Philos. Sci. In this work (Dowe, 2011a) and in his earlier Dowe (2008a, 2008b), he proves that log(arithm)-loss probabilistic scoring is the _only_ scoring system which remains invariant under re-framing of a question. He has been running a competition on Australian AFL football using this log-loss probabilistic compression-based scoring system every year since 1995, and in 1996 he introduced a Gaussian or Normal distribution competition on the margin of the game. In loosely related but less important work, Dowe also showed (2002 [private communication], 2006) how to incorporate prior probabilities into log-loss scoring, essentially by adding the entropy of the prior distribution.
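As a concrete illustration of the scoring rule behind the footy competition, the sketch below charges each forecaster -log2 of the probability they gave to the outcome that actually happened (lower totals are better); the outcomes and probabilities are invented for the example. The Gaussian margin competition scores the negative log of the predictive density at the actual margin in the same spirit.

```python
import math

# Invented match outcomes (1 = home win, 0 = away win) and two
# forecasters' stated probabilities of a home win.
outcomes   = [1, 0, 1, 1, 0]
forecast_a = [0.7, 0.3, 0.6, 0.8, 0.4]
forecast_b = [0.9, 0.1, 0.5, 0.9, 0.2]

def log_loss_bits(probs, results):
    """Total log-loss in bits: -log2 of the probability assigned to
    the outcome that actually occurred (lower is better)."""
    return sum(-math.log2(p if y == 1 else 1.0 - p)
               for p, y in zip(probs, results))

print(log_loss_bits(forecast_a, outcomes))
print(log_loss_bits(forecast_b, outcomes))
```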

Dowe has conjectured many times, from 1998 (p. 93) through to Dowe (2011a), that in order to obtain both statistical invariance and statistical consistency, in general one needs either Minimum Message Length (MML) or a close Bayesian “relative”. Statistical invariance says that we get the same answer whether we use (e.g.) polar co-ordinates (r, theta) or Cartesian co-ordinates (x, y). Statistical invariance is clearly at least aesthetically desirable. Statistical consistency says that we converge arbitrarily closely to the correct answer as we get more and more data.
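The sketch below shows why invariance is not automatic: the naive "average each co-ordinate" recipe (with invented data) gives different answers in Cartesian and polar co-ordinates, whereas an invariant method would not care how the question was framed.

```python
import numpy as np

rng = np.random.default_rng(0)
# invented data: 2-D points scattered around (2, 1)
xy = rng.normal(loc=[2.0, 1.0], scale=1.0, size=(10_000, 2))

# average each co-ordinate in Cartesian form
mean_xy = xy.mean(axis=0)

# same data, same recipe, in polar form (r, theta), mapped back
r = np.hypot(xy[:, 0], xy[:, 1])
theta = np.arctan2(xy[:, 1], xy[:, 0])
mean_polar = np.array([r.mean() * np.cos(theta.mean()),
                       r.mean() * np.sin(theta.mean())])

print(mean_xy)      # close to (2, 1)
print(mean_polar)   # a noticeably different point
```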

Statistical problems are known to exist (such as the Neyman-Scott (1948) problem and many variations on this theme) where no method other than MML or a closely-related variant has yet been shown to yield both statistical invariance and statistical consistency.
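To see the consistency failure concretely, here is a small simulation of the Neyman-Scott set-up (two observations per nuisance mean, with an invented true variance): the joint maximum-likelihood estimate of the variance converges to half the true value no matter how much data arrives. (The MML estimator, which handles this problem, is not reproduced here.)

```python
import numpy as np

rng = np.random.default_rng(1)
sigma2 = 4.0                        # invented true variance
n = 100_000                         # number of pairs
mu = rng.uniform(-10, 10, size=n)   # a separate nuisance mean per pair
x = rng.normal(mu[:, None], np.sqrt(sigma2), size=(n, 2))

xbar = x.mean(axis=1, keepdims=True)
# joint maximum-likelihood estimate of the common variance
ml_sigma2 = ((x - xbar) ** 2).sum() / (2 * n)

print(ml_sigma2)   # ~2.0: converges to sigma2 / 2, not 4.0 (inconsistent)
```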

Dowe (2008a, 2008b, 2011a) also introduces the “elusive model paradox”, a way of explaining the halting problem (which underlies Turing’s negative answer to the Entscheidungsproblem) to those who haven’t studied university computer science.
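Dowe's paradox itself is framed in terms of an "elusive" statistical model; what it conveys is the standard diagonal argument sketched below, where any claimed perfect halts() decider is fed a program built to do the opposite of whatever the decider predicts. The halts stub is of course hypothetical: the point is the contradiction, not runnable behaviour.

```python
def halts(program, argument):
    """Hypothetical perfect decider: would return True iff
    program(argument) eventually halts. No such total decider exists."""
    raise NotImplementedError

def trouble(program):
    # Do the opposite of whatever halts() predicts about us.
    if halts(program, program):
        while True:   # halts() said we halt, so loop forever
            pass
    # halts() said we loop forever, so halt immediately

# trouble(trouble) halts if and only if halts(trouble, trouble) says it
# doesn't -- so no correct halts() can exist.
```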

Wallace & Dowe (1999a, “Minimum Message Length and Kolmogorov Complexity”, Computer J.) is the most cited Wallace paper with a co-author still actively working on MML – and in 2005 it was recognised as the Computer Journal’s most downloaded article.

This has all been achieved despite quite a high university teaching load and an all-too-frequent re-allocation of teaching subjects.

Hernandez-Orallo & Dowe (2010, “Measuring Universal Intelligence: Towards an Anytime Intelligence Test”) was the Artificial Intelligence journal’s most downloaded article every week for 5 months in 2011. It provides the first usable general test of intelligence based on (algorithmic) information theory which is finite and can actually be stopped to give a (progressive) score – and it mentions (on pp. 1509 and 1536) the usefulness this might provide (in quantifying machine intelligence) as we approach the singularity.
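To give a flavour of what "anytime" means here (a toy stand-in, not the actual test from the paper): the loop below keeps a running average reward that is meaningful whenever the test is stopped, and adapts task complexity as the agent succeeds or fails. The pattern-prediction task and the sample agent are invented for illustration.

```python
import random

def make_task(complexity, rng):
    # Toy task: a hidden bit-pattern of length `complexity` repeats;
    # the agent sees the history and must predict the next bit.
    pattern = [rng.randint(0, 1) for _ in range(complexity)]
    stream = pattern * 3
    return stream[:-1], stream[-1]

def anytime_test(agent, steps, seed=0):
    rng = random.Random(seed)
    score, trials, complexity = 0.0, 0, 1
    for _ in range(steps):
        history, answer = make_task(complexity, rng)
        reward = 1.0 if agent(history) == answer else -1.0
        trials += 1
        score += (reward - score) / trials   # a valid score at any stop
        complexity = max(1, complexity + (1 if reward > 0 else -1))
    return score

# A trivial agent that predicts the next bit will equal the last bit seen.
print(anytime_test(lambda history: history[-1], steps=200))
```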

In August 2011, Dowe had 3 papers on (algorithmic) information theory and intelligence at the 4th Artificial General Intelligence conference, held at Google in California, U.S.A. These papers variously (i) show the subtle difference between one-part compression and two-part compression in defining intelligence, (ii) introduce the Darwin-Wallace distribution of environments over which to evaluate intelligent performance, and (iii) give results of perhaps the first – albeit preliminary – test of human vs computer-program intelligence using these information-theoretic means.

David Dowe was guest editor of the Christopher Stewart WALLACE (1933-2004) memorial special issue of the Computer Journal (Oxford Univ. Press) [Vol. 51, No. 5 (Sept. 2008)] and chaired the Solomonoff 85th memorial conference in Nov/Dec 2011.
