Re: consciousness based on information or computation?
Gilles HENRI, <Gilles.Henri.domain.name.hidden>, writes:
> I personally deny the concept of a "measure of conscious experience". I
> deny that consciousness is an *objective* property of matter, simply
> because you cannot define a physical property, measurable by an external
> apparatus, whose measurement would determine the degree of consciousness
> (or if you can, let me know).
I don't think it is that simple. Consciousness is an informational
property. Asking whether a physical system is conscious is analogous to
asking whether a program running on a piece of hardware has some property.
If you stare at the CPU and memory of a computer, and ask whether it is
running a particular program, it seems very hard to tell. The hardware
appears inert, superficially. It looks the same whether it is running
one program or another.
However, on closer inspection, you can actually get some information.
If you probe the hardware and examine the electrical signals in detail,
you can learn what program is being executed, and begin to answer
questions about it.
The same thing is true of the brain, or any other physical object. We can
probe it, learn about it, and develop theories about what is going on
internally.
Now, with consciousness, we have an additional problem, because we aren't
sure what kinds of programs must be running for the system to be conscious.
However, as we examine more brains in detail, we should be able to come up
with theories which at least say that certain kinds of programs, certain
kinds of brain activity, are correlated with consciousness. This may not
be a complete or exhaustive list of all forms of consciousness, but
conservatively we can say that these patterns do indicate the kind of
consciousness that most people have.
> Much of the discussion about consciousness is plagued by this fact,
> because we include it in formalisms that were not devised to handle it.
> The only consciousness we know is our own, through means that are
> different from those we use to interact with the outer world. We think
> that other human beings are conscious because of the similarity of their
> behavior to ours, but that does not define what consciousness is. If one
> succeeds in building a computer with human-like behaviour (which is quite
> possible, in my view), deciding whether it is actually conscious is purely
> a matter of convenience, not an intrinsic property. In other words, I
> think the proposition "someone other than me is conscious" is really
> unprovable.
Provability is not the issue. No propositions are absolutely provable.
We want to come up with theories about the universe. Theories are
inherently unprovable, but are based on our detection of regularities
and patterns in nature.
The problem with theories of consciousness isn't provability, it is
testability. How would we ever test the predictions of the theory?
In fact, there are some circumstances in which such testing can be done.
If the theory predicts that altering brains in certain physical ways
should result in particular changes in consciousness, we can make those
alterations and ask the subjects what they experience. Solipsists can even
alter their own brains so as to be sure of the results.
However, we cannot use this method to extend our theories much beyond
ordinary human brains. It is hard to see how we could ever reach
agreement about whether various kinds of animals are conscious. And in
the case of computer programs, if our theories predict that they are
conscious (based on the patterns of activity we have identified in
brains), and the programs themselves say they are conscious, some people
may still maintain that the programs are not.
Theories of consciousness face many difficulties, but despite these, the
project is not impossible. In particular, it should be possible to
construct conservative, limited, testable theories based on observations
of human brains, and these should achieve wide agreement and acceptance.
Hal
Received on Thu Jan 14 1999 - 08:36:25 PST