Re: The Irreducibility of Consciousness

From: John M <jamikes.domain.name.hidden>
Date: Sun, 6 Aug 2006 18:31:17 -0400

Brent:
My idea was exactly what you thundered against. There is no adequate proof:
science is a limited model-view, the quote from J. von Neumann even more so, and
the court-proof is the compromise (called law) between conflicting interests
in a society. Reasonable doubt relies on how stupid the contemplators are.
The 'model' you formulate and examine is based on a limited view of an already
established circle of relevance, drawn from those explanations people sweated
out with inadequate observational methods, immature conditions, and
thought limited by the era's epistemic cognitive inventory -
disregarding the 'rest' (maybe not even knowing about more at that time).
I am not sitting in the complacent lukewarm water of a limited knowledge base
and cutting my thinking accordingly - rather I confess to my ignorance and TRY
to come up with better.
I am not alone in this, nor too efficient either.

John M
----- Original Message -----
From: "Brent Meeker" <meekerdb.domain.name.hidden>
To: <everything-list.domain.name.hidden>
Sent: Sunday, August 06, 2006 5:15 PM
Subject: Re: The Irreducibility of Consciousness



Stathis Papaioannou wrote:
> John M writes (quoting SP):
>
>
>>St:
>>Are you suggesting that a brain with the same
>>pattern of neurons firing, but without the appropriate environmental
>>stimulus, would not have exactly the same conscious experience?
>>
>>[JM]:
>>Show me, I am an experimentalist. First show two brains with the same
>>pattern of (ALL!) neuron firings. Two extracted identical firings in a
>>superdupercomplex brain are meaningless.
>>Then, please, show me (experimentally) the non-identity of environmental
>>impacts reaching 2 different brains from the unlimited interaction of the
>>totality.
>>(I wrote already that I do not approve of thought experiments).
>
>
> Of course, you could not have both brains stimulated in the usual manner in
> both environments, because then they would not have identical patterns of
> neural firing; you would have to artificially stimulate one of the brains in
> exactly the right manner to mimic the stimulation it would receive via its
> sense organs. That would be very difficult to achieve in a practical
> experiment, but the question is, *if* you could do this, would you expect
> that the brains would be able to guess, on the basis of their subjective
> experience alone, which one was which?
>
> Actually, "natural" experiments something like this occur in people going
> through a
> psychotic episode. Most people who experience auditory hallucinations find
> it
> impossible to distinguish between the hallucination and the real thing:
> the voices
> sound *exactly* as it sounds when someone is talking to them, which is why
> (if
> they are that sort of person) they might assault a stranger on the train
> in the belief
> that they have insulted or threatened them, when the poor fellow has said
> nothing
> at all. I think this example alone is enough to show that it is possible
> to have a
> perception with cortical activity alone; you don't even need to
> artificially stimulate
> the auditory nerve.
>
>
>>St:
>>That would imply some sort of extra-sensory perception, and there is
>>no evidence for such a thing. It is perfectly consistent with all the
>>facts
>>to say that consciousness results from patterns of neurons firing in the
>>brain, and that if the same neurons fired, the same experience would
>>result regardless of what actually caused those neurons to fire.
>>
>>[JM]:
>>regardless also of the 'rest of the brain'? Would you pick one of the
>>billions completing the brainwork complexity and match it to a similar one
>>in a different complexity?
>>But the more relevant question (and I mean it):
>>what would you identify as (your version of) "consciousness" that "results
>>from neuron-firing" consistent with all the facts?
>
>
> My neurons fire and I am conscious; if they didn't fire I wouldn't be
> conscious, and if they fired very differently to the way they are doing I
> would be differently conscious. That much, I think, is obvious. Maybe there
> is something *in addition* to the physical activity of our neurons which
> underpins consciousness, but at the moment it appears that the neurons are
> both necessary and sufficient, so you would have to present some convincing
> evidence (experimental is always best, as you say, but theoretical will do)
> if you want to claim otherwise.
>
>
>>St:
>>As for consciousness being fundamentally irreducible, I agree
>>completely.
>>
>>[JM]:
>>Consider it a singularity, a Ding an sich? Your statement reads to me as
>>referring to a "thing", not a process. Or rather a state? (Awareness??)
>>*
>>St:
>>It is a fact that when neurons fire in a particular way, a conscious
>>experience results; possibly, complex enough electronic activity in a
>>digital computer might also result in conscious experience, although we
>>cannot be sure of that. But this does not mean that the conscious
>>experience
>>*is* the brain or computer activity, even if it could somehow be shown
>>that
>>the physical process is necessary and sufficient for the experience.
>>
>>[JM]:
>>I hope you could share with us your version of that "conscious experience"
>>as well, which "could" be assigned to a digital computer? What "other"
>>activity may a digital computer have beside "electronic"?
>>It is hard to show, for phenomena observed 'in parallel', whether one is
>>'necessary' for the other or just observable in parallel. Maybe "the
>>other" is necessary for the 'one'?
>>If you find that the 'physical' process (firing, or electronic) is
>>SUFFICIENT, then probably your definition is such that it allows such
>>sufficiency.
>>I may question whether the complexity of the assigned situation allows
>>such simplification.
>
>
> I don't know that computers can be conscious, and I don't even know that
> computers can emulate human-type intelligent behaviour. Proving the latter
> lies in the domain of experimental science, while proving the former is
> impossible, although it is also impossible to *prove* that another person
> is conscious.

I think you're setting too high a standard for "prove". If you set the
standard as in mathematical proof, then the proof is relative to the axioms.
If you set the standard of science, or of the "reasonable man" of courtrooms,
then I think it is possible. The scientific proof would be to construct a
model consistent with all observations, which is coherent with all other
accepted theories, and which includes consciousness as an essential element.
The legal standard for criminal trials is "beyond a reasonable doubt", and
that standard is met every day in courtrooms across the land, since
conviction for many crimes requires proving intent - a conscious state.

So I would ask: by what standard of 'proof' do you assert this impossibility?
And by the same standard, can you prove that other people exist - or that you
exist?

Brent Meeker
The sciences do not try to explain, they hardly even try to interpret, they
mainly make models. By a model is meant a mathematical construct which,
with the addition of certain verbal interpretations, describes observed
phenomena. The justification of such a mathematical construct is solely and
precisely that it is expected to work.
-- John von Neumann




