> This is true, as far as it goes, for any pre-given computation and
> given infinite resources (much as Turing machines emulating other
> Turing machines can require infinite tapes). In more realistic
> senses, however, it's typically faster to use a Turing machine to model
> a function you know ... a function you don't is often doable with a
Realistically you wouldn't use a Turing machine for anything other
than proving theorems about computation. I would use a real processor
like a Pentium and a modern compiler to model a function I know.
> learning NNet /of sufficient size and complexity/ faster than trying to
> figure out the formula and coding it up.
True, perhaps in conjunction with a genetic algorithm or simulated
annealing. Backpropagation and other supervised learning techniques
are becoming obsolete.
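To illustrate the gradient-free alternative mentioned above, here is a minimal simulated-annealing sketch. The toy target function and all parameter choices (step size, cooling rate) are my own illustration, not anything from the thread: it fits a two-parameter model by random perturbation and temperature-dependent acceptance, with no backpropagation at all.

```python
import math
import random

random.seed(0)

# Hypothetical target: recover y = 2x + 1 from samples, using simulated
# annealing on the parameters instead of gradient-based learning.
data = [(x, 2 * x + 1) for x in range(-5, 6)]

def loss(w, b):
    return sum((w * x + b - y) ** 2 for x, y in data) / len(data)

w, b = random.uniform(-1, 1), random.uniform(-1, 1)
best = loss(w, b)
temp = 1.0
for step in range(20000):
    # Propose a small random perturbation of the parameters.
    nw = w + random.gauss(0, 0.1)
    nb = b + random.gauss(0, 0.1)
    cand = loss(nw, nb)
    # Always accept improvements; accept worse moves with probability
    # exp(-delta / temp), which shrinks as the temperature cools.
    if cand < best or random.random() < math.exp((best - cand) / temp):
        w, b, best = nw, nb, cand
    temp *= 0.9995  # geometric cooling schedule

print(round(w, 2), round(b, 2))
```

A genetic algorithm would replace the single perturbed candidate with a population and crossover, but the acceptance-by-fitness idea is the same.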
> That still doesn't mean that there's `computation' going on inside the
> NNet, just as there isn't really `planning' going on inside a
> spreading activation agent network. Certain things threshold at
> certain times and certain actions/behaviours result.
You are using "computation" in a sense I don't recognize. What do
you mean by it?
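For concreteness, the quoted claim about thresholding can be sketched as follows. The node names, weights, and threshold are my own hypothetical example, not the poster's model: activation spreads along weighted links, and each node simply fires when its level crosses a threshold, with no symbol manipulation anywhere.

```python
# Hypothetical spreading-activation agent network: "hungry" is a source
# node; activation flows along weighted links each step.
links = {
    "hungry": [("seek-food", 0.3)],
    "seek-food": [("eat", 0.5)],
}
activation = {"hungry": 1.0, "seek-food": 0.0, "eat": 0.0}
THRESHOLD = 0.5

fired = []  # (step, node) pairs, in firing order
for step in range(1, 4):
    # Spread activation along every link.
    for node, out in links.items():
        for target, weight in out:
            activation[target] += activation[node] * weight
    # A node "fires" (its behaviour results) once it crosses threshold.
    for node, level in activation.items():
        if level >= THRESHOLD and all(n != node for _, n in fired):
            fired.append((step, node))

print(fired)
```

Each behaviour thresholds at a different time step, which is the sense in which "certain things threshold at certain times and certain actions/behaviours result" without any explicit planning.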
> ObMemetics: This may point to an unconscious point of slop in the
> terms we use without thinking about it. We use `thinking' a lot, and
> seem to imply some active reorganization of memes in the process, but
I wouldn't think the memes would need to be reorganized in the process
of thinking, only to act.
--
David McFadzean                 david@lucifer.com
Memetic Engineer                http://www.lucifer.com/~david/
Church of Virus                 http://www.lucifer.com/virus/