Information is what Schrödinger’s wavefunction tells. Information that is squared to get the probability of a measurement. Probability that is linked to an event, to knowledge. Information entropy is I(E) = −log2 p(E), where p(E) is the probability of an event. This probability is linked to uncertainty. How? Heisenberg Uncertainty Principle?
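The entropy formula above is easy to play with numerically. A minimal sketch in Python (the function name self_information is my own label for I(E)):

```python
import math

def self_information(p):
    """Self-information (surprisal) of an event with probability p, in bits.

    I(E) = -log2 p(E): the less probable the event, the more bits of
    information its occurrence carries.
    """
    if not 0 < p <= 1:
        raise ValueError("probability must be in (0, 1]")
    return -math.log2(p)

# A fair coin flip carries 1 bit; a 1-in-4 event carries 2 bits.
print(self_information(0.5))   # 1.0
print(self_information(0.25))  # 2.0
```

This is just the single-event surprisal; Shannon entropy would be its expectation over all outcomes.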
I²(E) = p, |Ψ|² = p
A space of solutions of the wavefunction separated by ħ/2, encoded in bits. The event is somehow linked to measurement.
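The squaring step — amplitude to probability to bits — can be sketched as a toy calculation. The two-component amplitude vector below is an arbitrary illustration of my own, not tied to any particular system:

```python
import math

# Toy two-outcome state: complex amplitudes (illustrative values only).
amplitudes = [complex(1, 0), complex(0, 1)]

# Normalize so the squared magnitudes sum to 1.
norm = math.sqrt(sum(abs(a) ** 2 for a in amplitudes))
amplitudes = [a / norm for a in amplitudes]

# Born rule: probability of measuring outcome i is |Psi_i|^2.
probs = [abs(a) ** 2 for a in amplitudes]

# Self-information of each measurement outcome, in bits: I = -log2 p.
bits = [-math.log2(p) for p in probs]

print(probs)  # ≈ [0.5, 0.5]
print(bits)   # ≈ [1.0, 1.0] — each outcome of this state carries one bit
```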
Something I’m playing with in my mind. Do with it as you may.