
From: <GSLevy.domain.name.hidden>

Date: Tue, 18 Jul 2000 18:36:30 EDT

In a message dated 07/11/2000 7:44:22 PM Pacific Daylight Time,

meekerdb.domain.name.hidden writes:

> > Well, we must start somewhere. But it is fun.
> >
> > How about casting the SE with Psi as a relative quantity just like
> > position or velocity? In other words we would always be talking about
> > Delta Psi = Psi1 - Psi0 rather than absolute Psi, where Psi0 would
> > correspond to the (coordinate system of the) observer.
>
> I think that would need to be the ratio, not the difference, since psi
> needs to represent the square root of a probability.
>
> Brent Meeker

Brent, taking a ratio is interesting! Going one step further, taking the
logarithm of the ratio Psi1/Psi0 generates something that looks like
information. Information can be added and subtracted in a linear fashion,
just like velocity and position.

In other words, the SE can be converted to express a wave of information.
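The additivity claim above can be checked directly: log-ratios of amplitudes compose by addition, while the raw ratios compose by multiplication. A minimal sketch, with made-up amplitude values at three hypothetical reference points:

```python
import math

# Hypothetical amplitude magnitudes at three reference points
# (illustrative values only, not from any actual wavefunction).
psi0, psi1, psi2 = 0.5, 0.8, 0.2

# Log-ratios behave like the "information" described above:
info_01 = math.log(psi1 / psi0)  # step from reference 0 to 1
info_12 = math.log(psi2 / psi1)  # step from reference 1 to 2
info_02 = math.log(psi2 / psi0)  # direct step from 0 to 2

# The two intermediate steps sum linearly to the direct log-ratio,
# just as displacements or velocities add.
assert math.isclose(info_01 + info_12, info_02)
```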

Psi can be converted to a probability, but only after normalization, which
involves taking the ratio (|Psi|**2 dx) / Integral(|Psi|**2 dx). This ensures
that the probability lies between 0 and 1.
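On a discrete grid, the normalization step amounts to dividing each cell's |Psi|**2 dx by the sum over all cells. A quick sketch, with arbitrary made-up amplitudes:

```python
# Discretized normalization: P(x) = |Psi(x)|**2 * dx / sum(|Psi|**2 * dx).
# The amplitudes below are arbitrary illustrative values.
dx = 0.1
psi = [0.2, 0.5, 1.0, 0.5, 0.2]  # un-normalized amplitudes on the grid

norm = sum(abs(p) ** 2 * dx for p in psi)          # the integral in the ratio
prob = [abs(p) ** 2 * dx / norm for p in psi]      # normalized probabilities

# After normalization each probability lies in [0, 1] and they sum to 1.
assert all(0.0 <= q <= 1.0 for q in prob)
assert abs(sum(prob) - 1.0) < 1e-12
```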

The mutual information between two events x and y can be expressed by taking
the log of the ratio of the squares of the normalized Psis. In other words:

Mutual information between event x and event y
  = log( |Psi(y|x)|**2 / |Psi(y)|**2 )
  = log( |Psi(x|y)|**2 / |Psi(x)|**2 )
  = log( |Psi(x,y)|**2 / ( |Psi(x)|**2 |Psi(y)|**2 ) )

To get the above, I just applied the equation defining mutual information to

the relationship between probability and normalized Psi.
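Treating the |Psi|**2 values as probabilities, the third form above is the standard pointwise mutual information log( P(x,y) / (P(x) P(y)) ). A small sketch over a toy joint distribution (the numbers are invented for illustration):

```python
import math

# Toy joint distribution P(x, y) = |Psi(x, y)|**2 over two binary events;
# illustrative values, chosen to sum to 1.
joint = {(0, 0): 0.4, (0, 1): 0.1,
         (1, 0): 0.1, (1, 1): 0.4}

# Marginals P(x) and P(y) by summing out the other variable.
px = {x: sum(v for (a, _), v in joint.items() if a == x) for x in (0, 1)}
py = {y: sum(v for (_, b), v in joint.items() if b == y) for y in (0, 1)}

def pmi(x, y):
    """log( P(x, y) / (P(x) P(y)) ): the log-ratio form given above."""
    return math.log(joint[(x, y)] / (px[x] * py[y]))

# Correlated outcomes carry positive information; anti-correlated, negative.
assert pmi(0, 0) > 0
assert pmi(0, 1) < 0
```

This also illustrates the point below: unlike a probability, the log-ratio is unbounded in both directions.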

We see now that information can range from minus infinity to plus infinity
and can be added or subtracted in a linear fashion. Measure expressed in
terms of information is relative, not absolute.

I will expand on this when I have more time.

George

Received on Tue Jul 18 2000 - 15:45:50 PDT


This archive was generated by hypermail 2.3.0 : Fri Feb 16 2018 - 13:20:07 PST