Jonathan, all:
JONATHAN:
> Dear elephant,
> I find your tone rather reminiscent of Struan - a pity really. It diminishes
> your otherwise valuable input.
ELEPHANT:
I am sorry. I understand the criticism - and nothing worse could be said of
any man's tone. And perhaps you are right. We need people to tell us these
things sometimes. One point: remember that this *is* email (perhaps I
should make use of the smiley device :-)? - No, I think it's mostly used
when not really meant). More to the point, Jonathan, it is only out of my
respect for your contribution and interest in the questions you raise that I
write: no disrespect to your person is intended in my occasionally brusque
remarks in reply to your points - quite the reverse. I am simply trying, in
my own mind, to clarify and express what I have to say as concisely as
possible. You might think some disrespect intended in the brevity of my
replies sometimes - far from it. But you sense that I am dismissive in
other ways? I will try to be less hasty.
JONATHAN:
> It's strange that you dismiss the idea of the "analogue computer" as
> "oxymoronic". The idea has been around for years, and machines by that name
> have been built.
ELEPHANT:
On the last point: that proves nothing. The idea of artificial intelligence
has been around for years, and machines to which its name has been
attributed have also been built. That doesn't stop the whole notion of
artificial intelligence being based on a mistake about what we ordinarily
call 'intelligence'. It is not the processing of data into data but the
creative handling of the projection of data onto the continuous which
constitutes what we have always called intelligence: intelligence as is
displayed in an intelligent scientist. It is possible to redefine
'intelligence' so that Artificial intelligence is not an oxymoron, but only
at the expense of making a calculator in some degree 'intelligent' (not a
high quality outcome).
Similarly, you can have non-oxymoronic 'analogue computers'. But only at
the expense of sacrificing your definition of 'computer' as a machine for
processing data. Machines which process the digital and machines which
process the analogue (by which we mean, in the case of clocks, the analogous
to the real, ie the continuous circle of a clock face rather than the
intermittent display of a digital watch) are not the same thing. So long as
you remember that, everything else is just a question of word play. But
even word play can be high quality or low quality. For instance, a
definition of 'computer' which allows machines that process the analogue to
be called computers makes steel plants into computers. Sure - you can add
refinements to exclude this overextension - but why make things so darn
complex when there is a simple and obvious distinction between machines that
process discrete packets of data and those that do not? Wouldn't the
simpler solution be higher quality? What motivates this attachment of
quality to the complex and obscure use of the word - that's what you have to
ask. Depressingly, a frequent motivation for this kind of thing is simply
hype, the need to attract attention and research grants, or, what's worse,
confused thinking mixed up with the hype and the need for personal success
so that the hypers can't tell the hype from reality.
JONATHAN:
> In your rather hasty and crass discussion of the subject, you missed my direct
> question (in an earlier post), so I repeat it here and await your answer:
> Do you think digital computers perform mathematical calculations?
ELEPHANT:
Well, to state nothing very surprising, it depends what you mean by
"perform". Obviously it's true that computers perform mathematical
calculations: that's what they're there for. But on the other hand there's
obviously a big difference between what's going on in a child's mind as he
learns geometry and what occurs inside a computer. The child is thinking
*of* something, he's thinking *about* a mathematical problem. Computers
don't think *about* a mathematical equation, they just *do* it. Now the
child is aiming to get himself into a good habit about this, so that in the
end the gap between the computer and the human "performance" of the maths is
narrowed as much as possible - that's what learning to do the maths properly
might amount to. But all the same, we mustn't forget this important
difference in the human and the mechanical starting points when it comes to
maths and the "performance" of mathematical equations. Even in a maths
professor, there's going to be a hangover from that childish state in which
he first started grappling with these problems - and that's just it: his
"performing" the mathematics is a kind of dedicated attention - thinking
*about* something, even if it's thinking not about apples and pears but
about the maths itself, about correct procedure and so forth. With a
machine it's not like that at all. Well it isn't is it? There's really
nothing for the machine to think "about" or to pay attention to: it just
does what it does and that's that. You can call that "performing" the
calculation, if you like. But my guess would be that it's called
"performing" the calculation precisely because it's based on the human
experience of a distance (one of attention) between the function that's to
be achieved (that of the correct mathematician) and the consciousness that's
underneath thinking about how to achieve that function. You can think of
this on analogy with the theatrical notion of the performance of a role
perhaps, but, better, on the idea of performing one's duties in a position
of responsibility or power, like being a judge or a general. One is
marshalling one's powers and forces to do it right. But in the case of the
computer, there's really no performance or act or attention or marshalling
of this kind: because the being of the computer isn't distinguishable from
its processing of the data: it just does what it does. That is to say, in
a nutshell, that the computer's mathematics is effortless: it is not a skill
or a discipline. I would say that the notions of skill and discipline are
essential to our right to say that so and so "performs" a mathematical
calculation.
JONATHAN:
> I note in the dispute that arose between Marco and Elephant, a central issue
> was their different uses of the word "event". I want to point out that words
> often change their meaning according to context. This even happens when using
> a "scientific" word like gravity (incidentally, the analogue computer first
> came up in the gravity discussion).
> It's interesting to see in our discussion that gravity has been called both a
> force and an acceleration.
> In physics, the two concepts are quite different, and force and gravity have
> different dimensional units (newtons for force, metres per second squared for
> acceleration). As far as Sir Isaac Newton is concerned, he actually changed
> the definition of gravity from force to acceleration, or rather, he clarified
> the distinction. After Sir Isaac, the word gravity took on a new meaning.
ELEPHANT:
Quite so. You state everything that I agree with, and you state it well.
It is the meaning of the word 'gravity' for which the word 'gravity' is a
name, and that named thing arose with Newton and not before.
I understood that Newton's invention was that force should be *measured* and
*expressed* as an acceleration of a mass, so that by "force" and
"acceleration" Newton refers to the same movement of the apple, albeit in
different units. Is this not correct? I am aware that there are the
different dimensional units - but I rather think that there is a strict
proportion between them, is there not? All things being equal, double the
force = double the acceleration.
Newton's first law:
"If no unbalanced forces push or pull a body, then that body will stay still
or keep moving with constant velocity. An unbalanced force will cause a
body to accelerate."
Thus the unit of force that is named after Newton is effectively a
measurement of acceleration, one refinement being that in the case of
competing forces these are measurements of *counterfactual* acceleration: ie
how fast the mass would change velocity but for the action of the other
forces (or accelerations) in other directions.
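The "counterfactual acceleration" point can be put in a few lines of code (my
own illustration, with made-up numbers - not anything from Newton's text):
each force alone would accelerate the mass at F/m, but only the unbalanced
remainder actually does.

```python
# Counterfactual vs actual acceleration for a body under two opposing forces.
# All numbers are illustrative.

mass = 10.0  # kg

# Two opposing forces on the body, in newtons (positive = rightward).
f_right = 30.0
f_left = -30.0

# What each force *would* do to the body on its own (m/s^2):
counterfactual_right = f_right / mass  # 3.0 m/s^2 rightward
counterfactual_left = f_left / mass    # 3.0 m/s^2 leftward

# What actually happens is fixed by the unbalanced (net) force:
net_force = f_right + f_left
actual_acceleration = net_force / mass  # 0.0: balanced forces, so the body
                                        # stays still or keeps its velocity

print(counterfactual_right, counterfactual_left, actual_acceleration)
```

So the first law is the special case where the counterfactual accelerations
cancel exactly.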
The factor missing from the account so far is that force and acceleration
are distinguished in virtue of the mass of the body in question. Newton's
second law:
"The acceleration of a body is directly proportional to the unbalanced
force applied to the body, and inversely proportional to the mass of the
body."
or:
force = mass * acceleration
This is really the key Newtonian contribution, and yes, you are absolutely
right, it is a distinction between acceleration and force.
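The two proportionalities in the second law can be checked mechanically
(again with illustrative numbers of my own choosing):

```python
# Newton's second law rearranged: a = F / m.
# Doubling the force doubles the acceleration; doubling the mass halves it.

def acceleration(force_newtons, mass_kg):
    """Acceleration in m/s^2 produced by an unbalanced force on a mass."""
    return force_newtons / mass_kg

a1 = acceleration(10.0, 2.0)  # 5.0 m/s^2
a2 = acceleration(20.0, 2.0)  # 10.0 m/s^2: double the force, double the acceleration
a3 = acceleration(10.0, 4.0)  # 2.5 m/s^2: double the mass, half the acceleration

print(a1, a2, a3)
```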
Did I say anything that would have led you to suppose I thought otherwise?
This division of the world into masses, forces, and accelerations is really
a very powerful intellectual tool. But that's just what it is: an
intellectual tool that a human being (Newton) came up with, not a reality
that is as old as the galaxies with which it is (mostly) in accord.
Is gravity a force or an acceleration? Well it's a force. But since there
is nothing to the notion of force but a mathematical relationship between
mass and acceleration, Ockham's razor would be quite entitled, from a purely
empiricist standpoint, to lop off this notion of force from our ontology
altogether. And after all, nobody has ever *cornered* gravity or indeed
force - nobody can say *what it is* beyond this mathematical relationship.
This being the case, it looks like the notion of "force", and thus the
notion of "gravity", is just a shorthand - an algebraic place-holder that
makes our maths go more smoothly. Someone might say: all there is *really*
in the universe is bodies and accelerations.
I don't think I'd take that view myself - I'm no empiricist. But it does
raise some interesting questions about the relationship between the
intellectual tricks that we find to be high quality (eg Newton's second law)
and our ontology: how many classes of beings we should allow to exist.
Supposing a really useful equation turned out to require 14 classes of
being: would the usefulness of the equation automatically lead to our saying
that there are these fourteen kinds of fundamental stuff in the universe?
Perhaps there might be a connection - but would it be an automatic one? Or,
rather, would it be possible to argue that high quality as the mathematics
might be, the quality of an ontological categorisation of the world is a
separate issue? Might one adopt the mathematics as high quality for its
purpose, but handle the implied ontological commitments with kid gloves?
What do you think? I think that there are separate questions here, the one
about the quality of an ontology, the other about the technological
effectiveness of a bit of maths. These are separate. Acknowledging the
point about Ockham's razor, Newton's second law doesn't automatically decide
the question of whether there are fundamentally one, two, three, or more
kinds of stuff in the universe. What it automatically decides is the
question of how much rocket fuel you have to load.
JONATHAN:
> The story doesn't stop there. Acceleration requires the action of a FORCE on a
> MASS, and Newton implicitly accepted the idea of force at a distance. This was
> not acceptable to Einstein; while one can feel the accelerating force of the
> seat back when the car accelerates, the parachutist in free fall cannot feel
> the acceleration towards earth. The parachutist is in free fall, and feels
> himself to be weightless. Thus, in General Relativity, the concept of free
> fall becomes one of following the "natural curvature of space", and a
> gravitation is conceived as the curvature itself (i.e. large masses tend to
> bend space so that objects in free fall are directed towards them). I know
> this all sounds confusing, but it is very common in science that new ideas
> tend to change the meaning of words that have till then been used in different
> ways.
ELEPHANT:
Too true. And along the way they confuse lesser minds. Just as I read this
I'm confused (as many have been) about how space can be curved, given that
"curved" is a spatial description. You seem to know about this and might be
able to guide us through the detail distinguishing the metaphor from the
maths. Have you any neat clarifications to offer? (please don't recommend
books - I can reread Feynman any time but I have a lot of other stuff to get
through)
BTW - I never did understand what was supposed to be so worrying about
action at a distance. After all even with the most contiguous bodies there
would always be *some* distance, and if force can act over that distance,
there is in principle no reason why the structure of the milky way should
not be what it is in virtue of the structure of galaxies beyond our ken.
With all best wishes and thanks for making me think a tiny little bit,
Puzzled Elephant
MOQ.ORG - http://www.moq.org
Mail Archive - http://alt.venus.co.uk/hypermail/moq_discuss/
MD Queries - horse@wasted.demon.nl
To unsubscribe from moq_discuss follow the instructions at:
http://www.moq.org/md/subscribe.html
This archive was generated by hypermail 2b30 : Sat Aug 17 2002 - 16:01:09 BST