[Corpora-List] QM analogy and grammatical incompleteness

Dominic Widdows widdows at maya.com
Mon Dec 19 04:12:00 CET 2005

Dear Rob,

A few quick thoughts.

Are you on the Conceptual Graphs mailing list
(http://www.conceptualgraphs.org/)? Philosophy and science are often
discussed there, possibly more than is typical on the corpora list.

> If the speed of light were infinite would this still result in an
> Uncertainty Principle? I'm wondering because I want to know exactly
> where the Uncertainty Principle begins. My own model is just
> inconsistent orderings of a set, say according to colour and size,
> where you know everything about both orderings at every moment, but
> only one ordering is possible at any given moment. Such orderings are
> fundamentally inconsistent so you get an uncertainty principle.
>
> Is this the same thing as an information lag due to a finite speed of
> light?

Something in common, perhaps. You have a dataset that you could index
along several dimensions, so you start by building a few specific
indexes to optimize for the sorts of queries you expect to have to deal
with most often. Suppose an unexpected type of query comes in: you
hadn't indexed along this dimension, and so you have to go back and
search the whole dataset linearly. But if the dataset is changing the
whole time, then by the time you've received an answer, the answer
isn't a true reflection of the physical world, and the question may
even be irrelevant. If nature is something that is changing the
whole time, and that is composed of many complex dimensions, it would
seem to follow that you can never be fully informed about nature.
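The analogy can be made concrete with a toy sketch (my own illustration; the records and field names are hypothetical, not from the original discussion): an index answers expected queries quickly, an unindexed query forces a linear scan, and if the data mutates during the scan the answer is already stale.

```python
# Hypothetical dataset, indexed along one expected dimension.
records = [
    {"id": 1, "colour": "red",  "size": 3},
    {"id": 2, "colour": "blue", "size": 1},
    {"id": 3, "colour": "red",  "size": 2},
]

# Build an index for the query type we expect most often.
by_colour = {}
for r in records:
    by_colour.setdefault(r["colour"], []).append(r)

# Expected query: a cheap dictionary lookup.
reds = by_colour["red"]

# Unexpected query along an unindexed dimension: a full linear scan.
small = [r for r in records if r["size"] <= 2]

# If 'records' mutates while the scan runs, 'small' describes a state
# of the data that no longer exists by the time the scan returns.
```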

> It hadn't struck me that vector representation is crucial to QM.
> Newton's laws apply to vectors, but they are not QM.

Very much not. Newton's Laws depend on knowing the position and
momentum at the same time. These quantities are relied on as constants
of integration when solving an equation of motion (because this is some
version of F=ma, hence a double differential, and you need to integrate
twice). Enter QM, claiming that it is precisely these two constants
that can't be measured at the same time. Popular science often speaks
of relativity as overturning Newton, but compared with QM, relativity
just completes Newtonian mechanics whereas QM denies it its most basic
information requirement.
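To spell out the two constants of integration, here is a minimal sketch (assuming a constant force F, for simplicity): integrating x''(t) = F/m twice picks up x(0) and the initial momentum p(0) = m x'(0), and without both of them the motion is underdetermined.

```python
def trajectory(x0, p0, F, m, t):
    """Closed form of x''(t) = F/m: integrating twice picks up
    x(0) = x0 and p(0) = p0 as the two constants of integration."""
    return x0 + (p0 / m) * t + (F / (2 * m)) * t ** 2

# Same force, same mass, same starting position: different initial
# momenta give different paths, so both constants are indispensable.
a = trajectory(0.0, 1.0, F=2.0, m=1.0, t=3.0)  # p0 = 1
b = trajectory(0.0, 0.0, F=2.0, m=1.0, t=3.0)  # p0 = 0
```

QM's claim is precisely that the pair (x0, p0) this function demands cannot both be measured exactly at the same time.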

> I'll have to look at it more closely and see if QM can be predicted
> purely from the fact that physical laws can be expressed in terms of
> vectors.

I think you need some sort of operator algebra as well. But perhaps
this distinction is more to do with defining the boundary between
linear algebra and functional analysis. QM needs probability as well
for generating physically testable predictions, which it does with
astonishing accuracy.

> Is this dependent on the fact that the Fourier coefficients are
> infinite-dimensional?

Not really, I think you run into some of the same conundrums in systems
with finite dimensions.
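A two-dimensional example makes the point (my own toy illustration, not from the original mail): the Pauli matrices sigma_x and sigma_z fail to commute, so no state can be a simultaneous eigenvector of both, and you get an uncertainty-style trade-off with no infinite dimensions in sight.

```python
def matmul(A, B):
    """Product of two 2x2 matrices given as nested lists."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

sigma_x = [[0, 1], [1, 0]]
sigma_z = [[1, 0], [0, -1]]

xz = matmul(sigma_x, sigma_z)
zx = matmul(sigma_z, sigma_x)

# The commutator [sigma_x, sigma_z] = xz - zx is nonzero, so the two
# observables cannot be sharply defined on the same state.
commutator = [[xz[i][j] - zx[i][j] for j in range(2)] for i in range(2)]
```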

On your more general points about language: I think that the goal of
complete, predictive knowledge of any complex language system is bound
to lead to disappointment. But I don't think that this invalidates the
goal of getting as much of it right as possible! We know that a
part-of-speech tagger trained on texts in one domain might not do so
well in other domains, but this doesn't at all mean that the system
isn't very valuable. We have to get better at the adaptive part as
well, and there has been plenty of recent and fruitful work to address
this part of the language processing challenge. New fields have to find
their balance between deduction and induction, and it is a shame if we
spurn one another's work too readily.

Best wishes,