Horse (horse@wasted.demon.nl)
Wed, 29 Jul 1998 13:53:58 +0100
Hi Jonathan and LS
> I'd like to offer my support to Horse on the "fuzzy logic issue"
> because I think that Magnus's and Platt's comments have in my
> opinion missed the main issue.
I think Platt is less rigid on this than Magnus, but Magnus's
objections _seem_ to stem from misunderstanding. I might be
wrong and I honestly don't want to seem patronising.
>I would describe the issue as one of DIGITAL vs. ANALOGUE
>logic. Most of us know that the average computer is a "digital"
> system and programmed with digital logic. On the other hand,
>the average human is much faster at doing analogue evaluation
> than digital calculations. Consider looking at a graph or chart
> compared to a list of numbers! Graphs are analogue
> representations - what people find easiest to look at.
>
Precisely. A visual representation is more in line with the way we
understand 'reality'. A picture is worth a thousand words.
>Now consider the way many of us use computers today. The
>interface is usually some variant of Windows (Microsoft, Apple or
>Xerox flavours), which I consider ANALOGUE interfaces. By that I
>mean that the user is primarily concerned with the spatial
>relationship of pictorial objects on his screen. That's very different
>from the digital command-line interfaces of a few years ago.
>But the computer still works internally by digital logic. All those
>nice pictures and graphs are encoded in millions of bits and
>bytes. At that level, it ain't very pretty. Modern computer
>applications are often bloated because we aren't very good at
>designing and programming them to emulate the analogue from
>the digital, so we do it by the sledgehammer approach.
>Most people consider the human brain as also operating digitally,
>with the "bits" represented by molecular interactions. However,
>the exact relationship between thought and the manipulation of
>those bits is quite obscure.
>Nevertheless, it is clear that the brain is excellent at manipulating
>certain types of analogue patterns e.g. pictures, sounds, motor
>control.
>These skills are exploited by the modern computer interface (inc.
>the mouse for motor control).
Which is why the next big shift in computer interfaces will be
towards virtual reality systems. I also think there will be a shift
back towards analogue computer systems (or better digital
emulation of analogue systems). The use of neural networks and
fuzzy logic is the initial stage of this shift. This is especially
true of neural networks, which are becoming more closely aligned
with probabilistic methods - i.e. Bayes' theorem and the like.
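To make the probabilistic angle concrete, here's a toy Python
sketch of a single Bayesian update. The scenario and all the
numbers are invented purely for illustration:

  # Toy Bayes' theorem update: P(H|E) = P(E|H) * P(H) / P(E)
  # All the numbers below are invented for illustration only.

  def bayes_update(prior, likelihood, evidence):
      """Posterior probability of a hypothesis H given evidence E."""
      return likelihood * prior / evidence

  # Prior belief that some pattern is present: 0.3
  # Probability of the observation if the pattern is present: 0.8
  # Overall probability of the observation: 0.5
  posterior = bayes_update(prior=0.3, likelihood=0.8, evidence=0.5)
  print(posterior)  # 0.48 - belief strengthened, but far from certain

The point is that the answer is a graded degree of belief, not a
hard yes/no.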
> On the other hand, numbers, grammar,
> dialectics and computer command lines push us towards digital
> binary thinking (A *or* NOT A). We do this using clumsy discrete
> "objects" which are sometimes too large for the task.
There's still a place, at present, for the above, but they will
eventually be superseded by more 'natural' methods as technology
advances. Bio-electronics seems to be exploring the analogue
aspects of computer systems.
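To see the binary/fuzzy contrast in code, here's a small Python
sketch. The 180cm threshold and the membership ramp are just
assumptions I've picked for the example:

  # Crisp (Boolean) membership: a person is tall or NOT tall.
  def is_tall_crisp(height_cm):
      return height_cm >= 180  # True or False, nothing in between

  # Fuzzy membership: a degree of tallness between 0.0 and 1.0.
  def is_tall_fuzzy(height_cm):
      if height_cm <= 160:
          return 0.0
      if height_cm >= 190:
          return 1.0
      return (height_cm - 160) / 30.0  # linear ramp from 160 to 190

  print(is_tall_crisp(179))  # False - one centimetre flips the answer
  print(is_tall_fuzzy(179))  # 0.633... - a graded, 'analogue' answer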
>
> Magnus wrote:
> >I think one of the reasons I don't like [fuzziness] is that it
> >sounds too much like contradiction, or platypus. Just
> >think about the original platypus, the animal that was
> >both a mammal and laid eggs. It wasn't a mammal and it
> >wasn't a reptile, so it must have been something fuzzy in
> >between!? We all know that it was the stupid
> >classification of mammals vs. reptiles that got us into the
> >mess in the first place.
> Exactly! The class objects (mammals or reptiles) were too big.
> They could only be used by approximation, leaving a large
> rounding error! It's similar to the problem of buying a sandwich
> with a large denomination banknote. The crisp, rigid Boolean
> (binary) logic system is a cause of the fuzziness.
There's also the problem of trying to force 'reality' into a box, or set
of boxes, and then getting upset when that same 'reality' refuses to
play the game.
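The platypus makes a nice worked example here. In a fuzzy scheme
it needn't be forced into one box at all - a small Python sketch,
with membership degrees I've invented purely for illustration:

  # Fuzzy classification: degrees of membership instead of one box.
  # The degrees below are invented for illustration only.
  platypus = {
      "mammal": 0.7,   # fur and milk - strongly mammal-like
      "reptile": 0.3,  # egg-laying - somewhat reptile-like
  }

  # A crisp classifier must round 0.7 up to 1 and 0.3 down to 0;
  # that rounding error is exactly where the 'platypus problem' lives.
  for category, degree in platypus.items():
      print(category, degree)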
>
> Thus it is wrong to think of "fuzzy logic" as fuzzy thinking. Its goal
> is precision.
Right on the button! Fuzziness is a means of softening the edges
of previously rigid categorisation.
>
> Horse, how did I do?
>
Great. I don't feel quite so isolated now. Something I've tried to
make clear in the past is that fuzzification is NOT a substitute for
the MoQ. It's a useful way of breaking down the rigidity of SOM
thought. Strict/rigid categorisation isn't the way to produce an
easily understood view of the world. The view of 'reality' as patterns
of value benefits from the removal of strictly delimited categories.
Horse
"Making history, it turned out, was quite easy.
It was what got written down.
It was as simple as that!"
Sir Sam Vimes.