On Oct 16, 11:37 pm, Bruno Marchal <marc....domain.name.hidden> wrote:
>
> > If it is 'a-rtificial' I question the 'natural' one (following
> > Bruno's fear of the (natural?) 'super-stupidity'). Yet I don't think
> > Marc wants to let himself denature into an artifact.
>
> Not necessarily, but look at Saibal's recent answer!
> This raises a question for Marc. What if the future "SAI" ("SI",
> should we say) are computationalist? Marc, is it ok if those SI
> reincarnate you digitally? Could they decide without your consent
> (without being super-stupid)?
Your points are well taken, Bruno. We should be highly suspicious of
any 'authority' that presumes to act without our consent.
As for cryonics, Saibal, I think it's a good option. If necessary,
I'm quite prepared to put myself in the freezer - I have no intention
of getting any older than a biological age of 65. If I live that long,
I might be the first guy in the world to volunteer for a 'live
freeze' (though I would probably have to move to a country whose
laws allow assisted suicide!)
>
> Again, not necessarily. Buddhism, unlike Christianity, has always been
> very aware that "religious truth", once "institutionalized", goes wrong
> ...
> To kill the Buddha, or sometimes just the master, is a way to remind
> the monks that they have to find the truth in themselves and never to
> take any master's talk for granted.
>
> > In our (by definition) lower mentality it is not likely that we can
> > 'kill' the smarter. So the condition involves an impossibility, even
> > if we were capable of recognising them
> > - which we are not likely to be.
>
> Agreed. It was just a parable for drawing attention to the danger of
> any use of authoritative argument in the field of fundamentals.
> Ah! But the Löbian machine too can be shown allergic to such argument.
> It's a universal dissident. Unfortunately, humans, like dogs, are still
> attracted to the practical philosophy according to which the "boss is
> right" (especially when wrong!)
>
> Bruno
>
> PS Perhaps this week I will get the time to send the next post in the
> "observer-moment = Sigma_1 sentence" thread.
>
Well, I'm pleased to hear the Löbian machine is a 'universal
dissident'. I wouldn't want to imply that 'the boss is right'. All I
was implying was that (in the case of super-intelligence) the boss
would be *stronger*. Whether the boss is right or not, we little guys
wouldn't have much leverage, so our negotiating position would be
seriously limited initially. The best that could be hoped for from
such a hypothetical 'social contract', in the beginning, is that the
SI doesn't hurt us.