Brent Meeker wrote:
>> in memory of James,
>>
> [...]
>> ==============================
>>
>> 2. A paradox?
>>
>> I will say that a function f is computable if there is
>> a well defined formal language FL in which I can explain
>> unambiguously how to compute f on an arbitrary input (number).
>> I must be able to recognise whether an expression in my language
>> is a grammatically correct definition of a function, that is,
>> syntax errors must be detectable.
>> All expressions in the language must be of finite length, and
>> the alphabet (the set of symbols) of the language is required
>> to be finite.
>> Now all finite strings, a fortiori the grammatically correct
>> ones, form a countable set. Just use the lexicographic order
>> defined above.
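
(Just to make the counting concrete, here is a small Python sketch of
such an enumeration. The alphabet below is only a placeholder for FL's
real symbol set; any finite alphabet behaves the same. Every finite
string appears at exactly one finite position, ordered first by length
and then alphabetically within each length.)

    from itertools import count, product

    ALPHABET = sorted("abcxyz()+,=")   # placeholder finite alphabet

    def all_finite_strings():
        """Yield every finite string over ALPHABET: shortest first,
        alphabetically within each length."""
        yield ""                                 # position 0: the empty string
        for length in count(1):                  # then length 1, 2, 3, ...
            for symbols in product(ALPHABET, repeat=length):
                yield "".join(symbols)
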
>>
>> So the set of functions computable with FL is countable.
>> A synonym of countable is *enumerable*, a term used by
>> computer scientists, so I will also use it.
>>
>> So I can enumerate the functions computable with FL, that is,
>> I can put them in a sequence f_0, f_1, f_2, f_3, ...
>> going through all the functions computable in or with FL.
>>
>> Let us consider the function g defined again by
>>
>> g(n) = f_n(n) + 1
>>
>> Now, g is not only well defined, but is even computable. To
>> compute it on n, just search the enumeration f_i for the nth
>> function, apply it to n (this gives an answer because the f_i
>> are computable), and add 1.
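
(Here is how that recipe looks as a Python sketch. It assumes two
helpers that are not defined here: is_grammatical(s), which decides
whether the string s is a grammatically correct function definition in
FL, and run_fl(s, n), which computes the function defined by s on the
input n; all_finite_strings is the enumerator sketched above.)

    from itertools import islice

    # is_grammatical and run_fl are assumed helpers, not defined here:
    # the first decides grammatical correctness in FL, the second runs
    # an FL definition on a given input.

    def fl_definitions():
        # The enumeration f_0, f_1, f_2, ...: keep exactly the strings
        # that are grammatically correct definitions, in shortlex order.
        return (s for s in all_finite_strings() if is_grammatical(s))

    def g(n):
        # The diagonal function g(n) = f_n(n) + 1.
        nth_def = next(islice(fl_definitions(), n, None))  # definition of f_n
        return run_fl(nth_def, n) + 1                      # f_n(n) + 1
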
>> So, there is a number k such that g = f_k, but then, again
>>
>> g(k) = f_k(k) = f_k(k) + 1
>>
>> But f_k(k) is a well defined number (f_k is computable), and
>> no number can be equal to "itself + 1". Contradiction.
>>
>> Could the set of computable functions in FL be uncountable, after
>> all, or is g not expressible in FL, or what?
>>
>> Where is the error? I'll let you think about it a little bit. You can
>> ask questions (notably in case of notational problems).
>
>I think there is a cheat here. "Computable" requires that the function
>be defined finitely. While
>
> g(n) = f_n(n) + 1
>
>appears to be a finite description, it implicitly calls on the infinite
>list of functions f_n. So its description is given only in terms of an
>enumerable but infinite string, and hence does not occur in the list
>f_k.
Of course there is a cheat. But I don't see any infinite string
appearing. The description of g is given only in terms of an
enumerable infinite *list*. And to compute g(n) we need only
generate the list up to its nth element f_n, and this for each n.
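
(Concretely, with the assumed helpers from the sketch above, nothing
infinite is ever written down: evaluating g(7), say, only pulls the
first eight grammatically correct definitions out of the lazily
generated list.)

    d_7 = next(islice(fl_definitions(), 7, None))   # stops after the 8th hit
    value = run_fl(d_7, 7) + 1                      # this is g(7)
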
I'll give a hint. There are two ways to correct the argument, two
possible cheats!
Correcting one transforms "the paradox" into a wonderful theorem.
Correcting the other transforms "the paradox" into another
wonderful theorem.
It is a deep "paradox". It is at the root of both theoretical
computer science and mathematical logic. It is worth thinking
about what is going on here.
So I'll let you (and perhaps others) think a little bit more ...
Bruno