Re: QTI & euthanasia

From: Michael Rosefield <rosyatrandom.domain.name.hidden>
Date: Thu, 23 Oct 2008 00:13:27 +0100

Interesting idea. But obviously 'memories' are quite hard to quantify when you
get down to it: not all memories are equal, some are stored in longer- or
shorter-term memory and have differing levels of cross-association with each
other and with emotional states, some are being accessed right now, and
personal behavioural tendencies and habits might not all be encapsulable
simply as 'memories' but rather as a function of ingrained neural circuit
configurations.

I think perhaps one of the problems here is that no-one yet knows how to
'construct' consciousness - what the informational dynamics need to look like,
what's necessary and sufficient, and how to categorise all the processes.
We're in the same sort of position as early biologists: they knew there was
a mechanism carrying hereditary information, but had no idea what it was or
how it worked. We need to discover our version of DNA... and, of course, as
with biology, that might only be the beginning.

But to get back to the point: once we can do this, then hypothetically
speaking we can parameterise any particular conscious state and quantify
divergences from it in any respect. Exactly what the probability
distribution would look like if this experiment were performed, taking
distances from the original (according to whatever metric is used) as your
set of alternatives, is anybody's guess, but I imagine that 'core'
personality aspects would be reflected in which dimensions (perhaps found
using principal component analysis) show the steepest drop-off.
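To make the PCA intuition concrete, here is a minimal sketch. Everything in it is hypothetical: the idea that a conscious state could be a feature vector, the choice of Euclidean distance as the metric, and the particular perturbation scales are all stand-in assumptions, not anything we actually know how to do.

```python
import numpy as np

# Hypothetical illustration: treat each 'conscious state' as a feature
# vector, generate perturbed variants of an original state, and use PCA
# (via SVD) to see which directions account for most of the divergence.
rng = np.random.default_rng(0)

n_dims = 5  # purely illustrative parameterisation of a state
original = rng.normal(size=n_dims)

# Perturb some dimensions much more than others, standing in for
# 'peripheral' vs 'core' personality aspects: the last dimensions are
# barely allowed to vary, mimicking a steep drop-off.
scales = np.array([2.0, 1.0, 0.5, 0.1, 0.05])
variants = original + rng.normal(size=(1000, n_dims)) * scales

# PCA: centre the data, then take the singular value decomposition.
centred = variants - variants.mean(axis=0)
_, s, vt = np.linalg.svd(centred, full_matrices=False)
explained = s**2 / np.sum(s**2)  # fraction of variance per component

# Distance of each variant from the original under a Euclidean metric.
distances = np.linalg.norm(variants - original, axis=1)

print(explained)  # sorted descending: most divergence in few directions
```

The point of the toy: components with tiny explained variance correspond to dimensions along which variants rarely stray far, which is one way the 'steepest drop-off' idea could show up in data.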

I get the feeling I've just used a whole lot of words to restate the
obvious....
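As for Stathis's copy thought experiment quoted below, one hedged way to put numbers on it: assume (and this is only one candidate assumption, not anything the post asserts) that the probability of 'ending up as' a copy is proportional to its number of instantiations, and see how the distribution shifts between his scenarios.

```python
from fractions import Fraction

def expectation_probabilities(instantiations):
    """Map copy -> instantiation count to copy -> probability,
    assuming probability is proportional to instantiation count."""
    total = sum(instantiations.values())
    return {copy: Fraction(n, total) for copy, n in instantiations.items()}

# 100 copies, one instantiation each (copy k has k% of the memories).
uniform = expectation_probabilities({k: 1 for k in range(1, 101)})

# A million instantiations of copy #1, one each of the rest.
skewed = expectation_probabilities(
    {1: 10**6, **{k: 1 for k in range(2, 101)}})

print(uniform[100])  # 1/100
print(skewed[1])     # 1000000/1000099 - copy #1 now utterly dominates
```

Of course this just pushes the question back a step: whether instantiation-counting is the right measure, and whether a copy with 1% of your memories counts as 'you' at all, is exactly what the thought experiment is probing.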


2008/10/22 Stathis Papaioannou <stathisp.domain.name.hidden>

>
> 2008/10/22 razihassan <RaziHassan1.domain.name.hidden>:
>
> > 2) I'd like to propose a thought experiment. A subject has his brain
> > cells removed one at a time by a patient assistant using a very fine
> > pair of tweezers. The brain cell is then destroyed in an incinerator.
> >
> > Is there a base level of consciousness beyond which, from the pov of
> > the subject, the assistant will be unable to remove any more cells,
> > since conscious experience will be lost? ie is there a minimum level
> > of 'experience' beyond which nature will appear to act to always
> > maintain the physical brain?
> >
> > If there is, does the second law of thermodynamics not suggest that
> > all brains inexorably head towards this quantum of consciousness, for
> > as long as our brains are physical?
>
> The problem you raise is one of personal identity, and can be
> illustrated without invoking QTI. If I am copied 100 times so that
> copy #1 has 1% of my present memories, copy #2 has 2% of my present
> memories, and so on to copy #100 which has 100% of my present
> memories, which copy should I expect to end up as, and with what
> probability? What about if there are a million instantiations of copy
> #1 and one instantiation of the rest? What if there are 10^100^100
> instantiations of copies with 1/10^100 of my present memories - as
> there well might be?
>
>
> --
> Stathis Papaioannou
>
> >
>

Received on Wed Oct 22 2008 - 19:13:35 PDT

This archive was generated by hypermail 2.3.0 : Fri Feb 16 2018 - 13:20:15 PST