Re: new paper on computationalism

From: Scott D. Yelich <scott.domain.name.hidden>
Date: Mon, 8 Jan 2001 16:58:01 -0700 (MST)

On Mon, 8 Jan 2001 hal.domain.name.hidden wrote:
> This struck me as funny:
> A disturbing possibility is that the measure of a brain (the # of
> ways in which it could be said to implement its regular computation)
> could depend on the size or structure of the brain. It is conceivable,
> for example, that a creature could be constructed which thinks
> like a human, but whose brain does not implement its computation,
> as mathematically defined, in nearly as many ways. Such a creature
> would be at great risk of enslavement by regular humans. The opposite
> scenario would also be possible. It is to be hoped that measure is
> not sensitive to the details of the construction of a brain.

Why?

This list always talks about everything -- but what is anything? Just
a bit stream?

> Presumably your concern is that if people come to believe that a
> particular creature's brain has less measure than others, we should care
> less about the welfare of that creature, and so would feel comfortable
> in enslaving it, since its suffering wouldn't matter. It seems absurd
> to imagine that the majority of people would allow these untestable
> philosophical musings to drive their ethical judgement on such an
> important issue.

I think the question is one of intelligence without emotion (i.e., pain, fear,
remorse), and thus without morals.

> Anyway, even if it happened, by your own arguments, mistreatment would
> be justified, hence there is no reason to hope for the contrary.

We already mistreat things... we (humans), however, seem to take
offense at being mistreated -- whether or not we can actually do
anything about it at the time.

> Hal
Ironic.

Scott
Received on Mon Jan 08 2001 - 17:12:22 PST

This archive was generated by hypermail 2.3.0 : Fri Feb 16 2018 - 13:20:07 PST