I agree with Jesse. Nature (if that exists) builds on redundancies. (As
does the UD.) So if the substitution level is at the level of the
neurons, "slight" changes don't matter.
Of course we don't really know our substitution level. It is consistent
with comp that the level is far lower. But then the same rule operates
at that level.
It probably converges to the "linear".
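
To make the redundancy point a bit more concrete, here is a minimal
sketch (purely illustrative: the toy tanh network, its size, and the 1%
noise level are my own assumptions, not a biological model). It shows
that perturbing every weight of a layered network slightly shifts the
output only slightly, instead of being amplified layer by layer:

import numpy as np

rng = np.random.default_rng(0)

def forward(x, weights):
    """Run x through a stack of tanh layers."""
    for W in weights:
        # The saturating nonlinearity damps, rather than amplifies,
        # small perturbations of the weights.
        x = np.tanh(W @ x)
    return x

# A random 10-layer network, 50 units wide (hypothetical toy model).
weights = [rng.normal(scale=1.0 / np.sqrt(50), size=(50, 50))
           for _ in range(10)]

# A copy with ~1% multiplicative noise on every weight: the
# "parameters slightly off" scenario.
noisy = [W * (1 + 0.01 * rng.normal(size=W.shape)) for W in weights]

x = rng.normal(size=50)
y_exact = forward(x, weights)
y_noisy = forward(x, noisy)

# Relative deviation of the final output; for this kind of network it
# stays on the order of a few percent rather than blowing up.
print(np.linalg.norm(y_noisy - y_exact) / np.linalg.norm(y_exact))

Whether real neural tissue behaves like this toy network is of course
exactly what is at issue, but it illustrates why "slight" errors need
not cascade.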
Bruno (PS: I will answer other posts asap).
On 10 Jul 2005, at 20:22, Jesse Mazer wrote:
> Stathis Papaioannou wrote:
>
>> Nevertheless, I still think it would be *extremely* difficult to
>> emulate a whole brain. Just about every physical parameter for each
>> neuron would be relevant, down to the atomic level. If any of these
>> parameters are slightly off, or if the mathematical model is slightly
>> off, the behaviour of a single neuron may seem to be unaffected, but
>> the error will be amplified enormously by the cascade as one neuron
>> triggers another.
>
> I don't think that follows. After all, we maintain the same
> personality despite the fact that these detailed parameters are
> constantly varying in our own neurons (and the molecules making up
> the neurons are being completely replaced every few months or so);
> neural networks are not that "brittle": they tend to function in
> broadly the same way even when damaged in various ways, and slight
> imperfections in the simulated behavior of individual neurons could
> be seen as a type of damage. As long as the behavior of each
> simulated neuron is "close enough" to how the original neuron would
> have behaved in the same circumstances, I don't think occasional
> slight deviations would be fatal to the upload (but perhaps the first
> uploads will act like people who are slightly drunk or have a
> chemical imbalance, and they'll have to experiment with tweaking
> various high-level parameters--the equivalent of giving themselves
> simulated Prozac--until they feel 'normal' again).
>
> Jesse
http://iridia.ulb.ac.be/~marchal/
Received on Sun Jul 10 2005 - 15:08:25 PDT