RE: UDA revisited and then some

From: Stathis Papaioannou <stathispapaioannou.domain.name.hidden>
Date: Thu, 7 Dec 2006 15:57:04 +1100

Brent Meeker writes:
> > You're implying that the default assumption should be that
> > consciousness correlates more closely with external behaviour than
> > with internal activity generating the behaviour: the tape recorder
> > should reason that as the CD player produces the same audio output as
> > I do, most likely it has the same experiences as I do. But why
> > shouldn't the tape recorder instead reason: even though the CD player
> > produces the same output as I do, it does so using completely
> > different technology, so it most likely has completely different
> > experiences from my own?
>
> Here's my reasoning: We think other people (and animals) are conscious and have experiences mainly because of the way they behave, and to a lesser degree because they are like us in appearance and structure. On the other hand, we're pretty sure that consciousness requires a high degree of complexity, something supported by our theories and technology of information. So we don't think that individual molecules or neurons are conscious - it must be something about how a large number of subsystems interact. This implies that any one subsystem could be replaced by a functionally similar one, e.g. a silicon "neuron", without changing consciousness. So our theory is that what matters is not technology in the sense of digital vs. analog, but something in a functional, information-processing sense.
>
> So given two things that have the same behavior, the default assumption is that they have the same consciousness (i.e. little or none in the case of CD and tape players). If I look into them more deeply and find they use different technologies, that doesn't do much to change my opinion - it's like a silicon neuron vs. a biochemical one. But if I find that the flow and storage of information is different, e.g. one throws away more information than the other, or one adds randomness, then I'd say that was evidence for different consciousness.

I basically agree, but with qualifications. If the attempt to copy human intelligence is "bottom up", for example by emulating neurons with electronics, then I think it is a good bet that if the result behaves like a human and is based on the same principles as the human brain, it probably has the same types of conscious experiences as a human. But long before we are able to build such artificial brains, we will probably have the equivalent of characters in advanced computer games, designed to pass the Turing Test using technology nothing like a biological brain. If such a computer program is conscious at all, I would certainly not bet that it is conscious in the same way a human is, just because it is able to fool us into thinking it is human.

Stathis Papaioannou
 You received this message because you are subscribed to the Google Groups "Everything List" group.
Received on Wed Dec 06 2006 - 23:57:21 PST

This archive was generated by hypermail 2.3.0 : Fri Feb 16 2018 - 13:20:12 PST