Re: Smullyan Shmullyan, give me a real example

From: Hal Finney <hal.domain.name.hidden>
Date: Fri, 19 May 2006 16:17:22 -0700 (PDT)

Bruno writes:
> Meanwhile just a few questions to help me. They are hints for the
> problem too. Are you familiar with the following "recursive" program
> for computing the factorial function?
>
> fact(0) = 1
> fact(n) = n * fact(n - 1)
>
> Could you compute "fact 5" from that program? Could you find a similar
> recursive definition (program) for multiplication (assuming your
> machine already knows how to add)?
> Could you define exponentiation from multiplication in a similar way?
> Could you find a function which would grow more quickly than
> exponentiation and which would be defined from exponentiation like
> exponentiation is defined from multiplication? Could you generalize all
> this and define a sequence of faster and faster growing functions? Could
> you then diagonalise effectively (= write a program that does the
> diagonalization) that sequence of growing functions, so as to get a
> function which grows more quickly than any one in the preceding
> sequence?
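
Just to fix notation before going on, here is a small Python sketch of
the recursive definitions Bruno is asking about; the names mult, expo
and tower are mine, not his, and each new function just iterates the
previous one:

        def fact(n):
            return 1 if n == 0 else n * fact(n - 1)

        def mult(a, b):   # multiplication from addition
            return 0 if b == 0 else a + mult(a, b - 1)

        def expo(a, b):   # exponentiation from multiplication
            return 1 if b == 0 else mult(a, expo(a, b - 1))

        def tower(a, b):  # the next step up: iterated exponentiation
            return 1 if b == 0 else expo(a, tower(a, b - 1))

        print(fact(5))     # 120
        print(mult(3, 4))  # 12
        print(expo(2, 5))  # 32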

Here's what I think you are getting at with the fairy problem. The point
is not to write down the last natural number, because of course there
is no such number. Rather, you want to write a program which represents
(i.e. would compute) some specific large number, and you want to come up
with the best program you can for this, i.e. the program that produces
the largest number from among all the programs you can think of.

If we start with factorial, we could define a function func0 as:

        func0(n) = fact(n)

Now this gets big pretty fast. func0(100) is already enormous: it is a
158-digit number. However we can "stack" this function by calling
it on itself. func0(func0(100)) is beyond comprehension. And we can
generalize, to call it on itself as many times as we want, like n times:

        func1(n) = func0(func0( ... func0(n) ... ))

where we have nested calls of func0 on itself n times. This really gets
big fast, much faster than func0.
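
In Python, that n-fold nesting of func0 could be written with a simple
loop (again, the code and names here are just my own sketch):

        def func0(n):
            return fact(n)

        def func1(n):
            # apply func0 to n, n times: func0(func0( ... func0(n) ... ))
            result = n
            for _ in range(n):
                result = func0(result)
            return result

Even func1(3) = fact(fact(fact(3))) = fact(720), a number with over
1700 digits, so don't expect to actually evaluate this for anything
but tiny arguments.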

Then we can nest func1:

        func2(n) = func1(func1( ... func1(n) ... ))

where again we have nested calls of func1 on itself n times. Since
func1(n) grows so fast, func1(func1(n)) grows amazingly faster, and of
course with n of them nested it is faster still.

This clearly generalizes to func3, func4, ....

Now we can step up a level and define hfunc1(n) = funcn(n), i.e. the
nth function in the sequence func1, func2, func3, ..., applied to n.
Wow, imagine how fast that grows. hfunc is for "hyperfunc".
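
One way to make the whole family precise (my own indexing, not anything
from the thread) is to give the sequence func0, func1, func2, ... a
single two-argument definition and then take the diagonal:

        def func(k, n):
            # func(0, n) = fact(n); func(k, n) nests func(k-1, .) on n, n times
            if k == 0:
                return fact(n)
            result = n
            for _ in range(n):
                result = func(k - 1, result)
            return result

        def hfunc1(n):
            # the diagonal: the nth function in the sequence, applied to n
            return func(n, n)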

Then we can stack the hfuncs, and go to an ifunc, a jfunc, etc. Well,
my terminology is getting in the way since I used letters instead of
numbers here. But if I were more careful I think it would be possible
to continue this process more or less indefinitely. You'd have program
P1 which continues this process of stacking and generalizing, stacking
and generalizing. Then you could define program P2 which runs P1 through
n stack-and-generalize sequences. Then we stack-and-generalize P2, etc.
It never ends. But it's not clear to me how to describe the process
formally.
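
Here is one guess at how it might be said formally; this is only my own
reconstruction, with made-up names stack and generalize, and it is
purely definitional (you could never actually run it for an interesting
argument). Treat one whole round of "stack and generalize" as an
operator that takes a function and returns the diagonal of the tower
built on it; P1 then just applies that operator n times:

        def stack(f, n):
            # apply f to n, n times
            result = n
            for _ in range(n):
                result = f(result)
            return result

        def generalize(f):
            # from f, build the sequence f, stack(f,.), stack(stack(f,.),.), ...
            # and return its diagonal (the nth function applied to n)
            def level(k, n):
                if k == 0:
                    return f(n)
                return stack(lambda m: level(k - 1, m), n)
            return lambda n: level(n, n)

        def P1(n):
            # stack-and-generalize n times, starting from factorial
            g = fact
            for _ in range(n):
                g = generalize(g)
            return g(n)

P2 would then iterate the P1 construction in the same way, and so on;
each new layer is another loop wrapped around the previous definition.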

So we have this ongoing process where we define a series of functions
that get big faster and faster than the ones before. I'm not sure how we
use it. Maybe at some point we just tell the fairy, okay, let me live
P1000(1000) years. That's a number so big that from our perspective it
seems like it's practically infinite. But of course from the infinite
perspective it seems like it's practically zero.

Hal Finney
