
NIPS 2002 Workshop



Universal Learning Algorithms and Optimal Search



Whistler/Blackcomb Resort, B.C., Canada, Saturday, December 14, 2002

Organizers: Jürgen Schmidhuber & Marcus Hutter, IDSIA, Switzerland


Andrei Kolmogorov: "The theory of probability ... can and should be developed from axioms ..."

William of Ockham: "Entities should not be multiplied unnecessarily."


Description

Recent theoretical and practical advances are driving a renaissance in the fields of Universal Learners (rooted in Solomonoff's universal induction scheme, 1964) and Optimal Search (rooted in Levin's universal search algorithm, 1973). Both are closely related to the theory of Kolmogorov complexity. The new millennium has brought several significant developments, including:
  1. Sharp expected loss bounds for universal predictors
    (no i.i.d. assumptions necessary)

  2. Theoretically optimal reinforcement learners
    for general computable environments

  3. Computable optimal predictions based on natural priors
    that take algorithm runtime into account

  4. Practical, bias-optimal, incremental, universal search algorithms
    that extend previous non-incremental ones.
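The core idea of Levin's universal search mentioned above can be illustrated with a toy sketch: in phase k, every program p of length l(p) <= k is run for 2^(k - l(p)) steps, so a program that solves the problem in t steps is found after roughly 2^l(p) * t total steps. The bit-string "programs", the toy interpreter, and the counting problem below are illustrative assumptions, not part of the workshop material:

```python
# Toy sketch of Levin's universal search: interleave all programs,
# giving each program p of length l(p) a time share proportional
# to 2**(-l(p)) in every phase.

from itertools import product

def toy_run(program, steps, target):
    """Hypothetical interpreter: read the bit tuple as a starting
    integer and 'count upward' for the given number of steps;
    success means the target is reached within the budget."""
    start = int("".join(map(str, program)), 2)
    return start <= target < start + steps

def levin_search(target, max_phase=20):
    for k in range(1, max_phase + 1):          # phase k
        for l in range(1, k + 1):              # program lengths up to k
            budget = 2 ** (k - l)              # time share ~ 2**(-l)
            for program in product((0, 1), repeat=l):
                if toy_run(program, budget, target):
                    return program, k          # solving program, phase found
    return None

print(levin_search(5))  # → ((1, 0, 1), 3)
```

The point of the schedule is that no single program can monopolize the search: slow short programs and fast long programs are both discovered in time within a constant factor of optimal for this enumeration.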

Kolmogorov's 100th birthday is a welcome occasion for this workshop (which follows the very successful NIPS 2001 Occam and MDL workshops) to discuss these recent advances. Topics will also include:
  1. Practical but general MML/MDL/SRM approaches
    with theoretical foundation

  2. Weighted majority approaches (with worst case loss bounds,
    as opposed to expected loss bounds for Bayesmix approaches)

  3. No free lunch theorems for the special case where
    the data is sampled from a uniform distribution.
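The weighted majority approach mentioned in topic 2 admits a compact sketch. Below is a minimal deterministic version in the spirit of Littlestone and Warmuth's algorithm; the penalty factor beta = 0.5 and the toy experts are illustrative choices. Its mistake count is bounded relative to the best expert on any individual sequence, with no distributional assumptions, which is exactly the worst-case flavor contrasted above with the expected-loss bounds of Bayesmix approaches:

```python
# Minimal deterministic weighted majority sketch: predict with the
# weighted vote of the experts, and multiply the weight of every
# expert that errs by beta, regardless of the master's own guess.

def weighted_majority(expert_preds, outcomes, beta=0.5):
    """expert_preds: one list of 0/1 predictions per expert;
    outcomes: the true 0/1 sequence.
    Returns (master's mistake count, final expert weights)."""
    weights = [1.0] * len(expert_preds)
    mistakes = 0
    for t, y in enumerate(outcomes):
        vote1 = sum(w for w, e in zip(weights, expert_preds) if e[t] == 1)
        vote0 = sum(w for w, e in zip(weights, expert_preds) if e[t] == 0)
        guess = 1 if vote1 >= vote0 else 0
        if guess != y:
            mistakes += 1
        weights = [w * beta if e[t] != y else w
                   for w, e in zip(weights, expert_preds)]
    return mistakes, weights

# Toy experts: always-1, always-0, and a perfect expert.
outcomes = [1, 0, 1, 1, 0]
experts = [[1] * 5, [0] * 5, list(outcomes)]
print(weighted_majority(experts, outcomes))  # → (0, [0.25, 0.125, 1.0])
```

Note how the weight of the perfect expert stays at 1.0 while the constant experts decay geometrically; the master tracks the best expert without ever knowing in advance which one it is.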

The optimists claim that the future will belong to universal or near-universal learners. They point out that these are far more general than traditional reinforcement learners and decision makers, which depend on strong Markovian assumptions, or than learners based on traditional statistical learning theory, which often require unrealistic i.i.d. or Gaussian assumptions. They also claim that, thanks to hardware advances, the time is now ripe for optimal search in algorithm space, as opposed to the limited space of reactive mappings embodied by traditional methods such as SVMs and feedforward networks.

The pessimists claim otherwise.

We expect a lively discussion between optimists and pessimists.


Participants

Researchers interested in the foundations of machine learning, including reinforcement learning, inductive inference, time-series forecasting, and optimization and search in complex domains, will find this workshop of interest.

Speakers will include Ray Solomonoff, Paul Vitanyi, Marcus Hutter, Juergen Schmidhuber, and other experts on Kolmogorov complexity and Solomonoff induction, as well as Manfred Warmuth and Nicolo Cesa-Bianchi as representatives of weighted majority approaches. We also expect contrarian speakers from the no free lunch camp.

The workshop will provide a forum for discussing results and problems in any of the above mentioned areas. We hope to examine the future directions and new perspectives that will keep the field lively and growing.


Recommended Introductory Online Reading References


Contact

Address:
   Marcus Hutter
   IDSIA
   Galleria 2
   CH-6928 Manno-Lugano
   Switzerland
Phone: +41-91/6108668
Fax: +41-91/6108661
Email: marcus@idsia.ch
Homepage: http://www.idsia.ch/~marcus
Address:
   Jürgen Schmidhuber
   IDSIA
   Galleria 2
   CH-6928 Manno-Lugano
   Switzerland
Phone: +41-91/6108662
Fax: +41-91/6108661
Email: juergen@idsia.ch
Homepage: http://www.idsia.ch/~juergen



Workshop Schedule (Saturday, December 14, 2002)

Invited Speakers

Paul Vitanyi, Centrum voor Wiskunde en Informatica, Amsterdam, The Netherlands
Nicolo Cesa-Bianchi, Dipartimento di Tecnologie dell'Informazione, Universita degli Studi di Milano, Crema, Italy
Ray Solomonoff, Oxbridge Research, Cambridge, MA, USA
Ilya Nemenman, Kavli Institute for Theoretical Physics, University of California, Santa Barbara, USA

Session 1, 07:30 - 10:30

07:30 - 07:45  Juergen Schmidhuber  Welcome
07:45 - 08:15  Ray Solomonoff  A General System For Incremental Machine Learning
08:15 - 08:45  Coffee break
08:45 - 09:15  Nicolo Cesa-Bianchi  Universal Prediction, Game Theory, and Pattern Recognition
09:15 - 09:45  Juergen Schmidhuber  Optimal Ordered Problem Solver
09:45 - 10:30  Panel Discussion (Solomonoff, Schmidhuber, Cesa-Bianchi)

Session 2, 16:15 - 19:00

16:15 - 16:45  Ilya Nemenman  Universal Learning: A View of a Bayesian
16:45 - 17:15  Paul Vitanyi  The Similarity Metric
17:15 - 17:45  Coffee break
17:45 - 18:15  Marcus Hutter  Universal Sequential Decisions based on Algorithmic Probability
18:15 - 19:00  Panel Discussion (Nemenman, Vitanyi, Hutter)


Abstracts of Presentations


Wrap Up and Pictures  :-)


Relevant Links

NIPS 2002 Conference
Occam Workshop (NIPS 2001)
Algorithmic Information Theory Resource Page
Job openings at IDSIA on related topics


Ray Solomonoff: "... Algorithmic Probability can serve as a kind of 'Gold Standard' for induction systems."

Leonid Levin: "Only math nerds would call 2^500 finite."


© 2002 by Marcus Hutter and Juergen Schmidhuber