
Algorithmic Information Theory and

Foundations of Probability

Alexander Shen

LIF Marseille, CNRS & Univ. Aix–Marseille. On leave from IITP, RAS, Moscow.

Supported in part by NAFIT ANR-08-EMER-008-01 grant.

E-mail: alexander.shen@lif.univ-mrs.fr

Abstract. The question of how and why mathematical probability theory can be applied to the "real world" has been debated for centuries. We try to survey the role of algorithmic information theory (Kolmogorov complexity) in this debate.

1 Probability theory paradox

One often describes the natural sciences framework as follows: a hypothesis is used to predict something, and the prediction is then checked against the actual observed behavior of the system; if there is a contradiction, the hypothesis needs to be changed.

Can we include probability theory in this framework? A statistical hypothesis (say, the assumption of a fair coin) should then be checked against the experimental data (results of coin tossing) and rejected if some discrepancy is found.

However, there is an obvious problem: the fair coin assumption says that in a series of, say, 1000 coin tosses all the 2^1000 possible outcomes (all 2^1000 bit strings of length 1000) have the same probability 2^-1000. How can we say that some of them contradict the assumption while others do not?

The same paradox can be explained in a different way. Consider a casino that wants to outsource the task of card shuffling to a special factory that produces shrink-wrapped, well-shuffled decks of cards. This factory would need a quality control department: it looks at each deck before shipping it to the customer, blocks some "badly shuffled" decks and approves others as "well shuffled".

But how is it possible if all n! orderings of n cards have the same probability?

2 Current best practice

Whatever the philosophers say, statisticians have to perform their duties. Let us try to provide a description of their current "best practice" (see [7, 8]).

A. How a statistical hypothesis is applied. First of all, we have to admit that probability theory makes no predictions but only gives recommendations: if the probability (computed on the basis of the statistical hypothesis) of an event A is much smaller than the probability of an event B, then the possibility of the event B must be taken into consideration to a greater extent than the possibility of the event A (assuming the consequences are equally grave). For example, if the probability of A is smaller than the probability of being killed on the street by a meteorite, we usually ignore A completely (since we have to ignore event B anyway in our everyday life).

Borel [2] describes this principle as follows: "... Fewer than a million people live in Paris. Newspapers daily inform us about the strange events or accidents that happen to some of them. Our life would be impossible if we were afraid of all the adventures we read about. So one can say that from a practical viewpoint we can ignore events with probability less than one millionth... Often, trying to avoid something bad, we are confronted with something even worse... To avoid this we must know well the probabilities of different events" (Russian ed., pp. 159–160).

B. How a statistical hypothesis is tested. Here we cannot say naïvely that if we observe some event that has negligible probability according to our hypothesis, we reject this hypothesis. Indeed, this would mean that any 1000-bit sequence of outcomes would force us to reject the fair coin assumption (since this specific sequence has negligible probability 2^-1000).

Here algorithmic information theory comes into play: we reject the hypothesis if we observe a simple event that has negligible probability according to this hypothesis. For example, if coin tossing produces a thousand tails, this event is simple and has negligible probability, so we don't believe the coin is fair. Both conditions ("simple" and "negligible probability") are important: the event "the first bit is a tail" is simple but has probability 1/2, so it does not discredit the coin. On the other hand, every sequence of outcomes has negligible probability 2^-1000, but if it is not simple, its appearance does not discredit the fair coin assumption.
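A standard counting argument (our illustration, not part of the original text) shows why "simple" outcomes are rare: there are fewer than 2^(n-m) descriptions shorter than n-m bits, so at most that many n-bit strings can be compressed by m or more bits. A minimal sketch:

```python
# Counting argument: descriptions of length below n-m number fewer than 2^(n-m),
# so at most that many n-bit strings can have complexity below n-m.
def max_fraction_compressible(n: int, m: int) -> float:
    """Upper bound on the fraction of n-bit strings compressible by at least m bits."""
    return (2 ** (n - m) - 1) / 2 ** n

# Fewer than 0.1% of 1000-bit strings can be described with 10 bits to spare.
print(max_fraction_compressible(1000, 10))  # just under 2^-10 = 0.0009765625
```

So any fixed "simple" event of negligible probability covers only a vanishing fraction of the 2^1000 equiprobable outcomes, which is what makes the rejection rule consistent.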

Often both parts of this scheme are united into the statement "events with small probabilities do not happen". For example, Borel writes: "One must not be afraid to use the word 'certainty' to designate a probability that is sufficiently close to 1" ([3], Russian translation, p. 7). Sometimes this statement is called the "Cournot principle". But we prefer to distinguish between the two stages, because for hypothesis testing the existence of a simple description of an event with negligible probability is important, while for the application of the hypothesis it seems unimportant. (We can expect, however, that events interesting to us have simple descriptions because of their interest.)

3 Simple events and events specified in advance

Unfortunately, this scheme remains not very precise: the Kolmogorov complexity of an object x (defined as the minimal length of a program that produces x) depends on the choice of programming language; we also need to fix some way of describing the events in question. Both choices change complexity only by O(1) asymptotically; however, strictly speaking, due to this uncertainty we cannot say that one event has smaller complexity than another. (The word "negligible" is also not very precise.) On the other hand, the scheme described, while very vague, seems to be the best approximation to current practice.

One possible way to eliminate complexity from this picture is to say that a hypothesis is discredited if we observe a very improbable event that was specified in advance (before the experiment). Here we come to the following question. Imagine that you perform some experiment and get a sequence of a thousand bits that looks random at first. Then somebody comes and says: "Look, if we consider every third bit in this sequence, the zeros and ones alternate." Will you still believe in the fair coin hypothesis? Probably not, even if you had not thought about this event before looking at the sequence: the event is so simple that one could have thought about it. In fact, one may consider the union of all simple events that have small probability, and it still has small probability (if the bound on the complexity of a simple event is small compared to the number of coin tosses involved, which is a reasonable condition anyway). And this union can be considered as specified before the experiment (e.g., in this paper).
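The "every third bit alternates" event can be examined by brute force on short sequences (a toy illustration of ours, using a hypothetical length of 15 bits so that exhaustive enumeration is cheap):

```python
from itertools import product

def every_third_alternates(bits):
    """True if the bits at positions 2, 5, 8, ... (every third bit) strictly alternate."""
    sub = bits[2::3]
    return all(a != b for a, b in zip(sub, sub[1:]))

n = 15  # 5 constrained positions; only 2 of the 2^5 patterns alternate
hits = sum(every_third_alternates(bits) for bits in product((0, 1), repeat=n))
print(hits / 2 ** n)  # 0.0625 = 2^-4
```

For 1000-bit sequences the same event has probability about 2^-332, negligible by any standard, while its description stays a one-line program.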

On the other hand, if the sequence repeats some other sequence observed earlier, we probably won't believe it was obtained by coin tossing, even if this earlier sequence had high complexity. One may explain this opinion by saying that the entire sequence of observations is simple since it contains repetitions; however, the first observation may not be covered by any probabilistic assumption. This could be taken into account by considering the conditional complexity of the event (with respect to all information available before the experiment).

The conclusion: we may remove one problematic requirement (being "simple" in a not-well-specified sense) and replace it by another problematic one (being specified before the observation).

4 Frequency approach

The most natural and common explanation of the notion of probability says that probability is the limit of the frequencies observed as the number of repetitions tends to infinity. (This approach was advocated as the only possible basis for probability theory by Richard von Mises.) However, we cannot observe infinite sequences, so any actual application of this definition must somehow deal with a finite number of repetitions. And for a finite number of repetitions our claim is not so strong: we do not guarantee that the frequency of tails for a fair coin is exactly 1/2; we say only that it is highly improbable that it deviates significantly from 1/2. Since the words "highly improbable" need to be interpreted, this leads to a kind of logical circle that makes the frequency approach much less convincing; to get out of this circle we need some version of the Cournot principle.

Technically, the frequency approach can be related to the principles explained above. Indeed, the event "the number of tails in 1,000,000 coin tosses deviates from 500,000 by more than 100,000" has a simple description and very small probability, so we reject the fair coin assumption if such an event happens (and ignore the dangers related to this event if we accept the fair coin assumption). In this way the belief that frequency should be close to probability (if the statistical hypothesis is chosen correctly) can be treated as a consequence of the principles explained above.
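How small is "very small" here? A standard Hoeffding bound (our computation, not the author's) already gives an astronomically small number; since it underflows ordinary floating point, we work with its base-10 logarithm:

```python
import math

# Hoeffding's inequality: P(|#tails - n/2| >= n*eps) <= 2 * exp(-2 * n * eps^2)
n, eps = 1_000_000, 0.1            # 10^6 tosses, deviation of 100,000
exponent = -2 * n * eps ** 2       # = -20000
log10_bound = (math.log(2) + exponent) / math.log(10)
print(log10_bound)  # about -8685.6: the probability is below 10^-8685
```

An event of probability below 10^-8685 with a one-sentence description is rejected with a clear conscience.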

5 Dynamical and statistical laws

We have described how probability theory is usually applied. But the fundamental question remains: probability theory describes (to some extent) the behavior of a symmetric coin or die and turns out to be practically useful in many cases. But is it a new law of nature or a consequence of the known dynamical laws of classical mechanics? Can we somehow "prove" that a symmetric die indeed has the same probability for each face (if the starting point is high enough and the initial linear and rotational speeds are large enough)?

Since it is not clear what kind of "proof" we would like to have, let us put the question in a more practical way. Assume that we have a die that is not symmetric and we know exactly the position of its center of gravity. Can we use the laws of mechanics to find the probabilities of the different outcomes?

It seems that this is possible, at least in principle. The laws of mechanics determine the behavior of the die (and therefore the outcome) if we know the initial point in the phase space (the initial position and velocity) precisely. The phase space is therefore split into six parts that correspond to the six outcomes.

In this sense there is no uncertainty or probability so far. But these six parts are well mixed, since very small modifications of the initial conditions affect the result. So if we consider a small (but not too small) region of the phase space around the initial conditions, and any probability distribution on this region whose density does not change drastically, the measures of the six parts will be in almost the same proportions, regardless of the distribution chosen.

The last sentence can be transformed into a rigorous mathematical statement if we introduce specific assumptions about the size of the starting region in the phase space and the variation of the density of the probability distribution on it; it can then be proved. Probably this is a rather difficult mathematical problem, not solved yet, but at least in principle the laws of mechanics allow us to compute the probabilities of the different outcomes for a non-symmetric die.

6 Are “real” sequences complex?

The argument in the preceding section would not convince a philosophically minded person. True, we can (in principle) compute some numbers that can be interpreted as the probabilities of the outcomes for a die, and we do not need to fix the distribution on the initial conditions; it is enough to assume that this distribution is smooth enough. But we still speak about probability distributions that are somehow externally imposed in addition to the dynamical laws.

Essentially the same question can be reformulated as follows. Make 10^6 coin tosses and try to compress the resulting sequence of zeros and ones with a standard compression program, say, gzip. (Technically, you first need to convert the bit sequence into a byte sequence.) Repeat this experiment (coin tossing plus gzipping) as many times as you want; it will never give you more than 1% compression. (Such compression is possible for less than a 2^-10000 fraction of all sequences.) This statement deserves to be called a law of nature: it can be checked experimentally in the same way as other laws are. So the question is: does this law of nature follow from the dynamical laws we know?
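The experiment is easy to approximate in code. The sketch below uses Python's zlib (the same DEFLATE algorithm that gzip uses) and os.urandom as a stand-in for actual coin tosses, packing 10^6 bits into 125,000 bytes:

```python
import os
import zlib

def compression_ratio(data: bytes) -> float:
    """Compressed size over original size, using DEFLATE at maximum effort."""
    return len(zlib.compress(data, 9)) / len(data)

coin_flips = os.urandom(125_000)   # 10^6 "coin toss" bits packed into bytes
all_tails = bytes(125_000)         # the all-zeros sequence, for contrast

print(compression_ratio(coin_flips))  # above 0.99: no 1% compression, ever
print(compression_ratio(all_tails))   # tiny: the simple sequence collapses
```

In practice DEFLATE makes random data slightly longer (it falls back to stored blocks plus headers), so the ratio for coin_flips even exceeds 1.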

To see where the problem lies, it is convenient to simplify the situation. Imagine for a while that we have discrete time, the phase space is [0, 1), and the dynamical law is x → T(x) = (if 2x < 1 then 2x else 2x − 1), i.e., T(x) = 2x mod 1.

So we get a sequence of states x0, x1 = T(x0), x2 = T(x1), ...; at each step we observe where the current state is, writing 0 if xn is in [0, 1/2) and 1 if xn is in [1/2, 1).

This transformation T has the mixing property we spoke about; however, looking more closely, we see that the sequence of bits obtained is just the binary representation of the initial condition. So our process just reveals the initial condition bit by bit, and any statement about the resulting bit sequence (e.g., its incompressibility) is just a statement about the initial condition.
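This can be checked directly with exact rational arithmetic (a sketch of ours; the initial condition below is an arbitrary dyadic rational chosen for illustration):

```python
from fractions import Fraction

def observed_bits(x0, n):
    """Iterate T(x) = 2x mod 1, recording 0 for x in [0, 1/2) and 1 for x in [1/2, 1)."""
    x, bits = x0, []
    for _ in range(n):
        bits.append(1 if x >= Fraction(1, 2) else 0)
        x = (2 * x) % 1
    return bits

x0 = Fraction(0b1011010011, 2 ** 10)   # binary 0.1011010011
print(observed_bits(x0, 10))           # [1, 0, 1, 1, 0, 1, 0, 0, 1, 1] -- the binary digits of x0
```

Each application of T shifts the binary expansion left by one position, and the observation reads off the digit that falls before the point, so the observed sequence is exactly the expansion of x0.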

So what? Do we need to add to the dynamical laws one more, metaphysical, law saying that the world was created in a random (= incompressible) state? Indeed, algorithmic transformations (including dynamical laws) cannot significantly increase the Kolmogorov complexity of the state, so if objects of high complexity exist in the (otherwise deterministic, as we assume for now) real world now, they must have been there at the very beginning. (Note that it is difficult to explain the observed randomness by saying that we just observe the world at a random time or in a random place: the number of bits needed to encode a time and place in the world is not enough to account for an incompressible string of length, say, 10^6. With the currently popular estimates of the size and age of the world, the logarithms of the ratios of the maximal and minimal lengths (or time intervals) that exist in nature are negligible compared to 10^6, and therefore a position in space-time cannot determine a string of this complexity.)

Should we conclude, then, that instead of playing dice (as Einstein might have put it), God provided concentrated randomness while creating the world?

7 Randomness as ignorance: Blum – Micali – Yao
