Re: Consciousness is information?

From: Alberto G.Corona <agocorona.domain.name.hidden>
Date: Sun, 17 May 2009 03:43:05 -0700 (PDT)

The hard problem may be unsolvable, but I think it would be even more
so if we don't first settle the easy problem, wouldn't it? With a
clear idea of the easy problem it is possible to infer something about
the hard problem:

For example, the latter is a product of the former, because we
perceive things that have (or had) relevance in evolutionary terms.
Second, the unitary nature of perception matches well with the
evolutionary explanation: "My inner self is a private reconstruction,
for fitness purposes, of how others see me, as a unit of perception
and purpose, not as a set of processors, motors and sensors, although,
analytically, that is what we are". Third, the machinery of this
constructed inner self sometimes takes control (i.e. we feel ourselves
capable of free will) whenever our acts would impact the image that
others may have of us.

If these conclusions all lie at the easy level, then I think we have
solved a few of the moral and perceptual problems that have puzzled
philosophers and scientists for centuries. Relabeling them as "easy
problems" the instant after an evolutionary explanation of them has
been aired is preposterous.

Therefore I think that answers your question: it's not only
information; it's about a certain kind of information and its own
processor. The exact nature of the processor that permits qualia is
not known; that's true, and it's good from my point of view because,
on one side, the unknown is stimulating, and on the other,
reductionist explanations for everything, like mine above, are a
bit frustrating.


On May 16, 8:39 pm, Kelly Harmon <harmon....domain.name.hidden> wrote:
> I think you're discussing the functional aspects of consciousness.  AKA,
> the "easy problems" of consciousness.  The question of how human
> behavior is produced.
>
> My question was what is the source of "phenomenal" consciousness.
> What is the absolute minimum requirement which must be met in order
> for conscious experience to exist?  So my question isn't HOW human
> behavior is produced, but instead I'm asking why the mechanistic
> processes that produce human behavior are accompanied by subjective
> "first person" conscious experience.  The "hard problem".  Qualia.
>
> I wasn't asking "how is it that we do the things we do", or, "how did
> this come about", but instead "given that we do these things, why is
> there a subjective experience associated with doing them."
>
> So none of the things you reference are relevant to the question of
> whether a computer simulation of a human mind would be conscious in
> the same way as a real human mind.  If a simulation would be, then
> what are the properties that those two very dissimilar physical
> systems have in common that would explain this mutual experience of
> consciousness?
>
> On Sat, May 16, 2009 at 3:22 AM, Alberto G.Corona <agocor....domain.name.hidden> wrote:
>
> > No. Consciousness is not information. It is an additional process that
> > handles its own generated information. If you don't recognize the
> > driving mechanism towards order in the universe, you will be running
> > on empty. This driving mechanism is natural selection: things get
> > selected, replicated and selected again.
>
> > In the case of humans, evolutionary psychologists and philosophers
> > (Dennett etc.) discovered some time ago the evolutionary nature of
> > consciousness, which is twofold: for social animals, consciousness keeps
> > an updated image of how the others see us. This ability is
> > very important in order to plan future actions with or towards other
> > members. A memory of past actions, favors and offenses is kept
> > for consciousness processing.  This is a part of our moral
> > sense, that is, our navigation device in the social environment.
> > Additionally, by reflection on ourselves, the consciousness module can
> > discover the motivations of others.
>
> > The evolutionary steps for the emergence of consciousness are: 1) in
> > order to optimize the outcome of collaboration, a social animal starts
> > to see the others as unique individuals, and memorizes its own
> > record of their actions. 2) Because the others do 1), the animal develops a
> > sense of itself and records how each of the others sees it
> > (this is adaptive because of 1). 3) This primitive conscious module,
> > evolved in 2), starts first to inspect, and later even to take control of,
> > some actions with a deep social load. 4) The conscious module
> > attributes to an individual moral self every action triggered by the
> > brain, even if it is driven by low instincts, just because that is the
> > way the others see him as an individual. That's why we feel ourselves
> > to be unique individuals with an indivisible Cartesian mind.
>
> > The consciousness ability is fairly recent in evolutionary terms. This
> > explains its inefficient and sequential nature. This, together with 3),
> > explains why we feel anxiety in some social situations: the cognitive
> > load is too much for the conscious module when it tries to take control
> > of the situation when self-image is at stake. This also explains why,
> > when we travel, we feel a kind of liberation: the conscious module is
> > made irrelevant outside our social circle, so our more efficient lower
> > level modules take care of our actions.
--~--~---------~--~----~------------~-------~--~----~
You received this message because you are subscribed to the Google Groups "Everything List" group.
To post to this group, send email to everything-list.domain.name.hidden
To unsubscribe from this group, send email to everything-list+unsubscribe.domain.name.hidden
For more options, visit this group at http://groups.google.com/group/everything-list?hl=en
-~----------~----~----~----~------~----~------~--~---
Received on Sun May 17 2009 - 03:43:05 PDT

This archive was generated by hypermail 2.3.0 : Fri Feb 16 2018 - 13:20:15 PST