Re: How would a computer know if it were conscious?

From: Russell Standish <>
Date: Thu, 14 Jun 2007 00:57:40 +1000

On Thu, Jun 14, 2007 at 10:23:38AM +1000, Colin Hales wrote:
> It may be technically OK then, but I would say the use of the word
> 'creativity' is unwise if you wish to unambiguously discuss evolution to a
> wide audience. As I said...
> I don't think we need a new word....I'll stick to the far less ambiguous
> term 'organisational complexity', I think. The word creativity is so
> loaded that its use in general discourse is bound to be prone to
> misconstrual, especially in any discussion which purports to be assessing
> the relationship between 'organisational complexity' and consciousness.

What sort of misconstruals do you mean? I'm interested...

'organisational complexity' does not capture the concept I'm after.

> The question-begging loop at this epistemic boundary is a minefield.
> [[engage tiptoe mode]]
> I would say:
> (1) The evolutionary algorithms are not 'doing science' on the natural
> world. They are doing science on abstract entities whose relationship with
> the natural world is only in the mind (consciousness) of their grounder -
> the human programmer. The science done by the artefact can be the
> perfectly good science of abstractions, but simply wrong or irrelevant
> insofar as it bears any ability to prescribe or verify claims/propositions
> about the natural world (about which it has no awareness whatever). The
> usefulness of the outcome (patents) took human involvement. The inventor
> (software) doesn't even know it's in a universe, let alone that it
> participated in an invention process.

This objection is easily countered in theory. Hook up your
evolutionary algorithm to a chemistry workbench, and let it go with
real chemicals. Practically, it's a bit more difficult of course, most
likely leading to the lab being destroyed in some explosion.

Theoretical scientists do not have laboratories to interface to,
though, only online repositories of datasets and papers. A theoretical
algorithmic scientist is a more likely proposition.
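[Editorial aside: the grounding gap Colin describes can be seen even in a toy evolutionary algorithm. The sketch below is a hypothetical illustration, not Koza's actual genetic-programming system; the genome encoding, fitness function, and parameters are all invented for the example. Its entire "world" is the abstract fitness function it is handed, so any link to real chemistry or physics exists only in the programmer's head.]

```python
import random

random.seed(0)

GENOME_LEN = 20
POP_SIZE = 30
GENERATIONS = 60

def fitness(genome):
    # Toy objective ("OneMax"): count the 1-bits. The algorithm
    # "does science" only on this abstraction; it has no awareness
    # of any natural-world referent the bits might stand for.
    return sum(genome)

def mutate(genome, rate=0.05):
    # Flip each bit independently with small probability.
    return [1 - g if random.random() < rate else g for g in genome]

def crossover(a, b):
    # Single-point crossover of two parent genomes.
    cut = random.randrange(1, GENOME_LEN)
    return a[:cut] + b[cut:]

# Random initial population of abstract bit strings.
population = [[random.randint(0, 1) for _ in range(GENOME_LEN)]
              for _ in range(POP_SIZE)]

for _ in range(GENERATIONS):
    population.sort(key=fitness, reverse=True)
    parents = population[:POP_SIZE // 2]  # truncation selection
    children = [mutate(crossover(random.choice(parents),
                                 random.choice(parents)))
                for _ in range(POP_SIZE - len(parents))]
    population = parents + children

best = max(population, key=fitness)
print(fitness(best))
```

Hooking such a loop to a "chemistry workbench" would mean replacing `fitness` with a measurement made on real chemicals, which is exactly where the practical difficulty (and the explosions) would come in.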

> (2) "Is this evolutionary algorithm conscious then?".
> In the sense that we are conscious of the natural world around us? Most
> definitely no. Nowhere in the computer are any processes that include all
> aspects of the physics of human cortical matter.


> Based on this, of the 2 following positions, which is less vulnerable to
> critical attack?
> A) Information processing (function) begets consciousness, regardless of
> the behaviour of the matter doing the information processing (form).
> Computers process information. Therefore I believe the computer is
> conscious.
> B) Human cortical qualia are a necessary condition for the scientific
> behaviour and unless the complete suite of the physics involved in that
> process is included in the computer, the computer is not conscious.
> Which form of question-begging gets the most solid points as science? (B)
> of course. (B) is science and has an empirical future. Belief (A) is
> religion, not science.
> Bit of a no-brainer, eh?

I think you're showing clear signs of carbon-lifeform-ism here. Whilst
I can say fairly clearly that I believe my fellow humans are
conscious, and that I believe John Koza's evolutionary programs
aren't, I do not have a clear-cut operational test of
consciousness. It's like the test for pornography - we know it when we
see it. It is therefore not at all clear to me that some n-th generational
improvement on an evolutionary algorithm won't be considered conscious
at some time in the future. It is not at all clear which aspects of
human cortical systems are required for consciousness.

A/Prof Russell Standish                  Phone 0425 253119 (mobile)
UNSW SYDNEY 2052         
Received on Wed Jun 13 2007 - 20:47:46 PDT

This archive was generated by hypermail 2.3.0 : Fri Feb 16 2018 - 13:20:14 PST