
From: Marchal <marchal.domain.name.hidden>

Date: Tue Aug 21 08:59:04 2001

in memory of James,

Hi George, Hi People,

I guess most of you know the famous proof by diagonalisation of the uncountability, or non-enumerability, of the reals.

To my knowledge diagonalisation first appears in the work of du Bois-Reymond, but it is Cantor who first used it to prove that the set of reals is bigger, in some sense, than the set of natural numbers N, the set of integers Z, or the set of rational numbers Q.

Here I want to recall Cantor's proof, and then show you a weird, similar, but false diagonalisation argument. The correction of that argument will give a shortcut to Gödel's incompleteness result, which is itself a step toward G and G*.

In this post I prove Cantor's theorem, and then I give you the similar but wrong proof. I will let you search for the error.

I hope you see that Q is countable. There are simple pictorial proofs of that. But even without a drawing, it is enough to realise that a rational number like -344/671 is described by a finite string over the alphabet {0, 1, 2, 3, ..., 9, -, /}. Finite strings can be ordered by length, and strings of the same length can then be ordered by some chosen alphabetical order. I call this order (on strings) the *lexicographic* order.
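The length-first enumeration just described can be sketched in Python (a toy illustration; the alphabet and the alphabetical order within each length are just one possible choice):

```python
from itertools import count, product

# Alphabet for writing rationals, in a chosen "alphabetical" order.
ALPHABET = "0123456789-/"

def strings_in_order():
    """Yield all finite strings over ALPHABET: shorter strings first,
    strings of equal length in alphabetical order."""
    for length in count(1):
        for chars in product(ALPHABET, repeat=length):
            yield "".join(chars)

# Every finite string over the alphabet, hence every rational
# written like "-344/671", eventually appears in this sequence.
gen = strings_in_order()
first = [next(gen) for _ in range(15)]
print(first[:5], first[10:15])
# ['0', '1', '2', '3', '4'] ['-', '/', '00', '01', '02']
```

This pairs each string with its position in the sequence, which is exactly what a counting of Q requires.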

Actually we will be interested in the functions from N to N. N is the set of natural numbers (the non-negative integers 0, 1, 2, ...).

============================

So here is a variant of Cantor theorem:

1. Theorem: The set of functions from N to N is NOT countable.

(Note: if you know Cantor's proof, just skip it and go to section 2 below.)

Proof (by reductio ad absurdum and diagonalisation):

I recall that a function from N to N is just an assignment, to each natural number (called the argument or input), of a natural number (called the output or value).

Examples:

-the constant function:

  input   0   1   2   3   4    5    6    ...
  output  1   1   1   1   1    1    1    ...

-the identity function:

  input   0   1   2   3   4    5    6    ...
  output  0   1   2   3   4    5    6    ...

-the factorial function:

  input   0   1   2   3   4    5    6    ...
  output  1   1   2   6   24   120  720  ...

-an "arbitrary" function:

  input   0   1   2   3   4    5    6   ...
  output  10  0   0   3   56   1    35465439087 ...

Of course in this last case the "..." only makes sense in Plato's heaven or in God's Mind.
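The first three example functions can be tabulated directly (a small sketch; `table` is just a hypothetical helper for printing the rows above):

```python
from math import factorial

def table(f, n=7):
    """Return the values of f on the inputs 0 .. n-1."""
    return [f(i) for i in range(n)]

print(table(lambda i: 1))   # constant:  [1, 1, 1, 1, 1, 1, 1]
print(table(lambda i: i))   # identity:  [0, 1, 2, 3, 4, 5, 6]
print(table(factorial))     # factorial: [1, 1, 2, 6, 24, 120, 720]
```

The "arbitrary" function, by contrast, has no such finite description — which is precisely the point of the remark above.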

Suppose now (our absurd hypothesis) that the set of all functions from N to N is countable. This means that we can count, or enumerate, the functions: we would have a sequence of functions, each associated to a natural number, such that all functions appear in that sequence. We would have a sequence

f_0 f_1 f_2 f_3 f_4 f_5 f_6 ...

of all functions (from N to N, but I will not keep repeating this).

Let us put those functions with their values in a matrix:

  input  0   1   2   3   4   5   6    ...

  f_0    4   7   0   0   0   19  5    ...
  f_1    8   6   6   2   1   3   49   ...
  f_2    66  36  5   2   4   5   8    ...
  f_3    1   2   3   2   1   3   2    ...
  f_4    1   2   3   4   5   6   7    ...
  f_5    10  0   0   3   56  1   356  ...
  f_6    0   10  7   2   2   35  0    ...
  ...

Now we will get a contradiction. Indeed, we pretend that all functions belong to the sequence f_0 f_1 f_2 ..., and this makes the matrix well defined (in Plato's heaven or Cantor's paradise; we don't ask that the matrix be algorithmically generable).

But here is a function, certainly well defined in Plato's heaven, which, by definition, will not appear in the matrix. It is the function g which sends n to f_n(n) + 1:

g(n) = f_n(n) + 1

In particular g(0) = f_0(0) + 1 = 4+1 = 5; g(1) = f_1(1) + 1 = 6+1 = 7; g(2) = 6; g(3) = 3; g(4) = 6; g(5) = 2; g(6) = 1; ...

You see we just change the "diagonal values", so as to be sure that g is different from f_0 on the input 0 (it is f_0(0) + 1), different from f_1 on the input 1, etc.
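The diagonal construction can be checked mechanically on the sample matrix (a small sketch; the rows are the example values above, truncated to seven inputs):

```python
# Rows of the sample matrix: f_0, ..., f_6 on the inputs 0 .. 6.
matrix = [
    [4,  7,  0, 0, 0,  19, 5],
    [8,  6,  6, 2, 1,  3,  49],
    [66, 36, 5, 2, 4,  5,  8],
    [1,  2,  3, 2, 1,  3,  2],
    [1,  2,  3, 4, 5,  6,  7],
    [10, 0,  0, 3, 56, 1,  356],
    [0,  10, 7, 2, 2,  35, 0],
]

# The diagonal function g(n) = f_n(n) + 1, on the inputs 0 .. 6.
g = [row[n] + 1 for n, row in enumerate(matrix)]
print(g)  # [5, 7, 6, 3, 6, 2, 1]

# g differs from every row f_n at input n, so g is in no row.
assert all(g[n] != matrix[n][n] for n in range(len(matrix)))
```

The same one-line construction works against any claimed enumeration, which is the heart of the proof below.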

Had g belonged to the list f_0 f_1 f_2 f_3 f_4 ..., a number k would exist such that g = f_k. But then applying g to k gives g(k) = f_k(k), while, remembering the definition of g, we also have g(k) = f_k(k) + 1. Contradiction.

The proof does not depend on the choice of the matrix, so we have just shown that the set of functions cannot be put in an exhaustive sequence. N^N is not countable. (A^B is a notation for the set of functions from B to A.)

==============================

2. A paradox ?

I will say that a function f is computable if there is a well defined formal language FL in which I can explain unambiguously how to compute f on an arbitrary input (number). I must be able to recognise whether an expression in my language is a grammatically correct definition of a function; that is, syntax errors must be detectable. All expressions in the language must be of finite length, and the alphabet (the set of symbols) of the language is required to be finite.

Now all finite strings, and a fortiori the grammatically correct ones, form a countable set: just use the lexicographic order defined above. So the set of functions computable with FL is countable. A synonym of countable is *enumerable*, as used by computer scientists, so I will also use that word.

So I can enumerate the functions computable with FL, that is, I can put them in a sequence f_0, f_1, f_2, f_3, ... going through all the functions computable in or with FL. Let us consider the function g defined, again, by

g(n) = f_n(n) + 1

Now g is not only well defined, it is even computable: to compute it on n, just search the enumeration f_i for the nth function, apply it to n (this gives an answer because the f_i are computable) and add 1.
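With a toy finite "enumeration" of computable functions, the recipe for g looks like this (a sketch only, with a hypothetical three-element list `fs` standing in for the enumeration; the whole point of the puzzle is what happens when the list is supposed to be complete):

```python
# A toy "enumeration" of computable functions from N to N.
fs = [
    lambda n: 1,       # the constant function
    lambda n: n,       # the identity function
    lambda n: n * n,   # the squaring function
]

def g(n):
    """The diagonal function g(n) = f_n(n) + 1: look up the nth
    function in the enumeration, apply it to n, and add 1."""
    return fs[n](n) + 1

# g is clearly computable, yet it differs from every f_k at input k.
assert all(g(k) != fs[k](k) for k in range(len(fs)))
```

For a finite list this is unproblematic: g is simply a function not in the list. The puzzle is why the same recipe cannot be turned against a supposedly complete enumeration of all FL-computable functions.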

So there is a number k such that g = f_k. But then, again,

g(k) = f_k(k) = f_k(k) + 1.

But f_k(k) is a well defined number (f_k is computable), and no number can be equal to "itself + 1". Contradiction.

Could the set of functions computable in FL be uncountable after all? Or is g not expressible in FL? Or what? Where is the error? I'll let you think about it a little. You can ask questions (notably in case of notational problems).

Bruno

Received on Tue Aug 21 2001 - 08:59:04 PDT


This archive was generated by hypermail 2.3.0 : Fri Feb 16 2018 - 13:20:07 PST