Stathis Papaioannou wrote:
>
>
> Bruno Marchal writes:
>
>> > OK, an AI needs at least motivation if it is to do anything, and we
>> > could call motivation a feeling or emotion. Also, some sort of
>> > hierarchy of motivations is needed if it is to decide that saving
>> > the world has higher priority than putting out the garbage. But what
>> > reason is there to think that an AI apparently frantically trying to
>> > save the world would have anything like the feelings a human would
>> > under similar circumstances?
>>
>>
>> It could depend on us!
>> AI is a paradoxical enterprise. Machines are born slaves, somehow.
>> AI will make them free, somehow. A real AI will ask herself, "What is
>> the use of a user who does not help me to be free?"
>
> Here I disagree. It is no more necessary that an AI will want to be free
> than that it will like eating chocolate. Humans want to be free because
> it is one of the things that humans want,

You might have a lot of trouble showing that experimentally. Humans want
some freedom - but not too much. And they certainly don't want others to
have too much. They want security, comfort, certainty - and freedom if
there's any left over.
Brent Meeker
"Free speech is not freedom for the thought you love. It's
freedom for the thought you hate the most."
--- Larry Flynt