(C)1995 Lee Kent Hempfling All Rights Reserved

A short while ago I was locked in debate with a prominent British neuroscientist. The conversation turned from joint
intellectual stimulation to attack and counterattack. I expected his defense mode to kick in, but I was honestly surprised at the
extremity of the response. It seemed that he perceived my challenges as a personal attack. No matter how much I tried to
hold my position of equal discussion, and still tried to calm the rhetoric being thrown back, I was not able to refrain
from firing a return shot. I guess I should feel bad about that. But I don't.

There was no possible conclusion that I was at fault for his overactive defensiveness. The challenger can present a
position, then receive rebuttal. Or the challenger can present a position and remain ignored. I am honestly pleased he
chose to retort with vengeance. If he had chosen to simply ignore me there would not have been the future discussion we
both enjoyed. If he had chosen to simply make his position known and end it there, I would not have resorted to
manipulation.

Oftentimes when I find my challenges going unnoticed I will present a deeper challenge. Presuming that conversation is
the goal, I will entice the other party into it, even if it sets off protective alarms. In that conversation I had begun by
making the statement that there was something amiss with the study of the brain. An open enough comment to begin
any form of give and take. His response was in defense of the entire neuroscience community. Many of whom, I am sure,
would not have come to his rescue. Far too much research is driven by competition.

Something amiss with the study of the brain. Perhaps, I added, the study was focused on the wrong thing. The wrong
thing? How dare I charge that noted scientists were following an improper protocol. I didn't say that. How dare I charge
that the hottest topic of modern science, besides DNA, was being observed incorrectly. I did say that. How dare I
presume a position of superiority over those with years of study and mounds of degrees and accolades on the wall. I
paraphrase that last one for the children who might read this.

It wasn't the research that was being defended; it was the perceived attack on him. His object of interest was personal.
Obviously. Which was my point. The object of interest dominates the potential for research, as it does the potential of
conversation. Despite the mutual original intent of asking and answering and conversing, the conversation turned into self-preservation.

In an attempt to shatter his self-proclaimed need for survival I challenged the community he defended by stating that
neuroscience, and the other avenues of brain research, were taking an object based approach to evaluation. I should
have known better. It was no longer a discussion of the study of the brain. It had turned to a defense of the scientific
method. So I needled more.

Science, I declared, had stopped searching for the causes of things. Oh, shame on me. I should have decided to spend
my time that day basking in some other heat. Science, he asserted, IS the study of the causes of things. To which my
statement that philosophy is the study of the causes of things, while science is the study of the things caused, drew even
further fire.

My point still eluded him. How could I assert that the very thing standing in the way of a mutually enlightening discussion
was the thing that stood in the way of true advancement in research of the brain? I resorted to comparison. The
researcher seeking the cause of a viral infection will seek the virus, identify it, and concoct a way to kill or disable it. He
agreed. But the brain researcher will seek the cause of emotions by seeking the emotion, identify it, and declare it to be
found, never assuming that it must itself be caused by something. He disagreed. Of course he did.

So I changed the direction. A computer programmer, I asserted, writes code. He agreed. That code is a series of limits.
Instructions, he said. Alright. Call them instructions for now. The instructions set parameters for the action of the
program. Yes. Those parameters each have a point where they hand off the computation to another instruction. He
agreed. That point is the limit of that particular instruction. He agreed. Then hesitated. Then urged me to continue....

In a series of instructions a point is reached where the final set of instructions must either hand the computation back
to the first instruction for reevaluation or hand the computation off to another set. He agreed. So each instruction is a
limit in itself that only appears unlimited by its handshaking with another instruction. He pondered. OK, he said, if you
insist on calling an instruction a limit I'll assume you have a reason. It seemed that the unreality of a phantom code
instruction was not a personal attack. So I proceeded.
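
For the reader, here is a minimal sketch, in Python, of the picture I was drawing. The steps and the cutoff of 100 are my own invention, purely for illustration: each instruction computes only within its own bounds, then hands the value off, and the final instruction either hands the computation back to the first or hands it off out of the chain.

    # Toy chain of instructions-as-limits: each step works only within
    # its own bounds, then hands the computation off to the next.
    def step_a(x):
        return step_b(x + 1)    # step_a's limit: add one, then hand off

    def step_b(x):
        return step_c(x * 2)    # step_b's limit: double, then hand off

    def step_c(x):
        # The final instruction either hands the computation back to
        # the first for reevaluation or hands it off out of the chain.
        if x < 100:
            return step_a(x)    # hand back for reevaluation
        return x                # hand off: the chain's overall limit

    print(step_a(1))  # cycles through the chain until the limit is crossed

Each function only appears unlimited because it shakes hands with the next; the limit is always there, in the condition that ends the cycle.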

Does the instruction at any time in its proliferation between other instructions ever gain an advantage over its next
potential? Excuse me? He said. What I mean is, does the set of instructions ever come to a conclusion ahead of its input?
Of course not. That is impossible, he replied. Is it? I said. How then would you explain your initial reaction to this
discussion? What do you mean? He asked. How did you assume, from my single statement that neuroscience was
essentially barking up the wrong tree, that I was attacking you? Are you neuroscience? Of course not, he replied. I am a
neuroscientist. My whole life has been spent investigating the brain and you attacked what I do. But I didn't attack you.
Did I? No, not directly. Then you made an assumption before you had the input to do it, didn't you? Well, I guess I did,
he said.

So then. If your brain was in any way concocted of sets of instructions, or rules for action, how could it have come to a
conclusion from a potential it had not yet experienced? It's the brain, he said, it is not a computer. Well then, how
could a computer, which acts only on what it has been given to react on, and only in the order it has been given, ever
come to a conclusion about a potential? Simple, he said. A database of events contains all of the previously experienced
events. They are checked and the one that worked before is brought up and acted upon.
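
His description amounts to a table lookup, and a few lines of Python make its weakness plain. The events and responses here are hypothetical stand-ins of my own choosing.

    # Case lookup as he described it: previously experienced events and
    # the responses that worked, consulted when a new event arrives.
    past_events = {
        "challenge to protocol": "defend the protocol",
        "personal insult": "return fire",
    }

    def respond(event):
        # The program can only bring up what it has already stored;
        # a genuinely novel event finds no precedent to act upon.
        return past_events.get(event, "no precedent: no response")

    print(respond("personal insult"))             # found: act on what worked before
    print(respond("a challenge, not an attack"))  # novel: nothing to bring up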

So, I said... You have had this discussion before? No. Of course not, he replied. Then you didn't have a previous
discussion in memory to compare it with, did you? I continued. You didn't have a previous discussion that resulted in a
personal attack, from which a correlation between this one and that one made your determination to become defensive. I've been attacked before, he said. Of course you have. But I was not attacking you. He pondered again.

Had I committed a personal attack I could understand your becoming defensive. But I simply challenged a protocol. You
took it as a personal attack. So your brain acted without input. It acted before a justifiable input was present. You were
consciously aware of a perception that did not exist. Can a computer do that?

No it can not, he replied, and don't underestimate the power of digital processing. We were not talking about digital
processing at that moment. We were talking about the brain. He continued: people make assumptions that just because
something happens in the brain and it is not digital, it can not be duplicated by a digital computer. That is wrong. If it
happens it can be done in digital. He finished extolling the virtues of the digital domain long enough for me to jump
back in.

I am sure you are determined, I said, to defend the medium you have come to accept as normal. It is not. Show me
anything in nature that functions in on-off. He thought for a moment and then replied sheepishly, Well, right off the bat I
can't think of anything, but that does not mean digital is not the way to go in brain replication.

But the brain is natural, is it not? He agreed, wondering where I was leading this time. So I continued.... The digital
domain makes a natural process seem replicated by breaking the parts of that process down into small bits which,
strung together in a very fast configuration, give the illusion of the natural process. He studied that one for a moment
then agreed again. So I continued..... The sad state of affairs comes when the process that is used to make the illusion of a
natural procedure overtakes the natural procedure, and people start to base their comparison of natural processes on the
unnatural process that can only imitate it, not duplicate it.
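
The illusion is easy to demonstrate. A short Python sketch of my own: sample a smooth, natural curve at discrete steps, and the finer and faster the steps, the more the frozen samples pass for the curve itself.

    import math

    # A continuous, natural process: a smooth sine wave.
    def natural(t):
        return math.sin(t)

    # The digital version: the same process broken into small bits,
    # frozen between samples. Coarse steps show the seams; fine, fast
    # steps create the illusion of the natural process.
    def digital(t, step=0.5):
        return math.sin(step * round(t / step))

    for t in [0.1, 0.2, 0.3, 0.4]:
        print(f"natural={natural(t):.3f}  digital={digital(t):.3f}")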

Things must have begun to sink in, as his delivery slowed and the mental process of thought overtook the process of
expressing it. He was doing what the digital computer can not do. He was thinking. In order for anything to think it must
be able not only to make comparisons and judgements; it must be able to project those comparisons and
judgements onto a potential, in order to see if a new comparison must be considered and a new judgement must be made.
As he thought I continued on.

Object based reasoning, sir, I said, is when the object sought overcomes the potential of its cause, and the object sought is
used as the basis of its own evaluation without consideration of its cause. It must have been a wake-up call. He
objected. I had interrupted his thought process with something that did not compute. That was my point.

Where neuroscience seeks to discover things about the brain, it uses the objects it observes to establish the criteria. In
doing so the causes of those objects are ignored while the object is searched for. So neuroscience studies the results of
the brain and imposes them as potential causes.

Language is a perfect example. It is the object of research. But it is not the causative factor. Hearing is. Language is
made up of things like syntax, rules of usage, grammar, verbal skills in expressing thoughts: all things humans have
discovered that make up the language process. But just because they were discovered to control the language process,
does that mean there are areas in the brain where these things take place?

So then how does the process of hearing turn into the process of speech and language? How is it that people born
without the ability to hear can not speak either? Does the speech or language center of the brain control the hearing
process? Of course not. So the hearing process must then have something to do with the speech process. But how does
it get to that point? 

How does the sound input of the ear turn into the sound output of the mouth, and do so at a speed of computation that
is faster than the ear hears? After all, if your hearing and speech processes were controlled by the digital process you
would not speak until you were spoken to. You would only respond to what you heard, and you would never have an
original thought that you would ever express without some external stimulus causing it to be uttered. Your brain functions
in a process that is faster than the input. There is no digital computer that can do that, no matter how fast it processes
data. But enough for now.

The conversation was about to end from his refusal to accept an opinion contrary to his long-held belief in the potential
of digital processing. I had to do something to get his attention back to the talking and away from his imminent
walking. Object Based Reasoning is being used elsewhere too, I said. Relief. I could sense it. The attack would now be
turned to something other than his own beliefs. He was back in the discussion.

It's happening in every school in America. Oh good, not even the same continent. He was safe. I continued... It's called
Object Based Education. He was slightly familiar with it. Simply put, it is when the potential goal is evaluated as to its
parts, and the parts are taught to reach that goal. Nothing wrong with that. He asserted. But there is. I claimed.

Let's use an example. Little Bobby needs to learn how to read. So in the first grade in this country the teachers teach him
how to read. They do this based on the concept of what reading is made of. It is made of seeing words, reading those
words, which are made up of letters, and reciting the same word easily the next time it is seen. It is a memory-based
process. But is it? Of course it is, he replied. No it is not, I argued.

Reading is the verbal expression of a written word that represents a concept. In order to understand the concept one
must first understand the word. To understand the word one must first understand what the word is made of. Only by
understanding the individual sounds of the word's parts and then putting them together to make the word will the word
be readable the first time it is seen. Letters are taught as the parts of words but letters are not the parts of words. They
are the vehicles that make up the sounds of words. The sounds of words make up the words. The words make up the
concept.

Same difference, he said. No it isn't, I said. Little Bobby listens to his teacher read the words in his workbook. He is then
commanded to learn how to spell them. He does this by knowing the letters. But spelling, in the English language most
particularly, can not always be done correctly by the letters alone. Spell the word Lieutenant, I commanded. Very
funny, he said. The conversation was being held over a terminal. OK, just think about the spelling of the word Lieutenant.
Should it be Lutenant? Why not? It sounds like it. For a child who has never seen the word Lieutenant before, and has
only been taught that letters make up words, it is the memory that takes over. The thought process is not nurtured. There
is no connection to a potential spelling of a word that has not already been seen. By learning the spelling of letters that
make up words the child is forever relegated to little and simple words. A word like Lieutenant, which consists of several
syllables, can not be spelled by a person who understands only the letters and not how they interact with each other.

Should the child see the word and try to read it, the word might come out like LYEEYOUTEENENTA. That is the
culmination of all of the letter sounds. Nothing is taught as to how they affect each other. Where in the rules of modern
language do you find the description for the use of combined letters? He was silent. Then he said, I BEFORE E EXCEPT
AFTER C. Of course you would say that, I said. That is the one rule everyone is hammered to remember. So the mention
of rules for combined letters brings up that one.

But where are the rules that affect the letter "I"? When it is followed by an "E" it is silent. So the word may then become
LEEYOUTEENENTA. Where are the rules to break a word down into groups of two or three letters to establish the syllable
sound? So the word might then become LEEYOUT-EN-EN-TA. Where are the rules to establish the two or three letter
breakdowns? So the word might come out as LU-TEN-ENTA.... Why do you keep putting an ‘AH' sound at the end of
the word? He asked. Why, haven't you heard a child recite their ABC's? I asked. S-T-U-V... but the letters are taught as
to their sounds, so the T can sound like a TAH or a TEE or a short Tih. It is not taught to be a potential ending. It is taught
to be a separate letter with the same importance as all of the others. If the letter sounds were not taught as the only way
to get to a word, the word might come out as LU-TEN-ENT. The child might say it correctly when reading it. But he will
never spell it correctly.
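
The contrast can be sketched in a few lines of Python. The letter sounds and combination rules below are toy stand-ins of my own, not a real phonics system; the point is only that sounding letters one at a time and applying rules for letter groups produce very different words.

    # Toy contrast: letter-by-letter sounding vs. rules for combined letters.
    LETTER_SOUNDS = {"l": "L", "i": "YEE", "e": "EE", "u": "YOU",
                     "t": "TA", "n": "N", "a": "EN"}

    def letter_by_letter(word):
        # What the child taught only letters produces: every letter sounded.
        return "".join(LETTER_SOUNDS.get(c, c) for c in word.lower())

    def with_combination_rules(word):
        # A few toy rules for letter groups: "ieu" as one sound, "ten"
        # as a syllable, "ant" as an ending. Stand-ins for real rules.
        w = word.lower()
        for group, sound in [("lieu", "LU-"), ("ten", "TEN-"), ("ant", "ENT")]:
            w = w.replace(group, sound)
        return w.upper()

    print(letter_by_letter("lieutenant"))        # a letter-by-letter garble
    print(with_combination_rules("lieutenant"))  # LU-TEN-ENT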

But then who cares? I asked. The goal is to recite the word to make a sentence to read. What difference does it make that
the child is lost in the individual word if it comes out right? A great deal of difference, he stated. Correct, I said. Now
we're agreeing on something. So what about math? I asked. Math is math, he stated. It's not the same as learning how
to read. Oh isn't it? I asked.

In American schools numbers are taught as to their corresponding value of quantity. Nothing wrong with that, he said.
But there is no relationship taught, I replied. Little Bobby, in his second class of the day, might be asked to add two
apples and an orange. How many are there? The answer is two apples and an orange. But he will be instructed to add the
numbers by using comparative graphic depictions of the quantity represented by the numbers. So the answer is 3. When
little Bobby grows up he will continue to use graphic depictions of things that are not comparable to deduce values that
should be comparable, but not as he was taught. Add to that, pun intended, the concept of calculation.
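
The apples-and-oranges point is exactly what a short typed sketch makes visible. A minimal illustration of my own in Python:

    from collections import Counter

    # Quantities kept with their kinds, as little Bobby first answers.
    basket = Counter({"apple": 2, "orange": 1})
    print(dict(basket))          # {'apple': 2, 'orange': 1} -- two apples and an orange

    # What the worksheet teaches: strip the kinds and keep a bare count.
    print(sum(basket.values()))  # 3 -- the relationship between the things
                                 # counted is thrown away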

Go on, he said. In very young grades students are taught to add and multiply and divide and subtract quantities. Then
before too long they are permitted to use calculators to accomplish that feat. The end result is the sum. So as long as the
student knows how to pick out the numbers and enter them into the hand-held computer device he can use it on tests to
get the sum. You see, the Object Based Education principle is the end result. Not how it is achieved. Whoever
permitted calculators in schools forgot somebody had to come up with the little buggers, and if they had not been forced
to use their heads to do so there would not be calculators today. Nor would there be full-blown computers where you and
your colleagues could muddle over algorithms attempting to artificially imitate a natural process without giving
reference to the cause of the natural process. You only attempt to replicate the result.
No response. I had assumed he would consider the direct attack as venomous. But he did not.

So I continued.... Brain research today, bordering on the philosophical concept of epistemology, the study of HOW we
think, ignores the how. It would make a great monkey. Excuse me? He asked.

Monkey see, monkey do, I said. Monkey see speech. Make a program that talks. Monkey see hearing. Make a program
that records sound. Monkey see touch. Make a program that senses pressure and does something when it reaches a
threshold. Monkey see thinking. Make a program that uses the current case to evaluate against previous similar cases,
causes the outcome to be what worked before, and commits the result to memory to be considered again as what did or
did not work. Monkey see intelligence. Monkey add all the above and declare it to be intelligent. Monkey stupid.

You're saying neuroscience is stupid? He asked. No, I said, you are saying neuroscience is stupid. I am saying
neuroscience is a result of the modern way of viewing things. It is a result of the process of observation carried to the
point of absurdity. It looks at the result of the brain and determines that the result must be caused by whatever process
lends itself to that result.

Vision is another example. How do we see? Do we see in lines that connect and form shapes, like the current digital
replication process? Of course we don't. Do we see in the holographic images associated with some research today? Of
course we don't. We see in minute values. But we also see ahead of those values. Say what? He asked.

I'm not talking about soothsayers or psychics. I'm talking about the process of thinking. It observes and it expounds. It
enlarges what it sees past the sight itself. If it didn't, you would not be spending great amounts of your fellow
countrymen's hard-earned tax money on research. You would only be doing search. Research is a projection process.
Search only finds what is. So why then do neuroscience, computer science and the like continue to study, in a research
protocol, the things that a simple search could identify?

Now I am lost, he said. No need to be, I replied. The reason is that the process that will be used to replicate what is sought
can only do so in a search mode. It can not research. I think I'm getting it, he said. The digital computer is the goal
on one hand, while the goal on the other is the duplication of the process of thinking in order to understand it better. Together
they require the observer to judge the process based upon its potential duplication process. So scientists think in digital.

Which is not a natural process, and is not something that digital can duplicate; it can only replicate. Go on, he said. Object
Based Reasoning looks at the object observed and declares it to exist. Since it exists it must therefore be caused by
something that lends itself to that goal. So the answer is to look at the goal, break it down into its obvious parts and
duplicate those parts in a digital environment. Whop! Success. But failure too. Because while the result may have been
achieved, the HOW was not. Only the HOW TO was. The HOW TO is then declared to be the result, and that causes the
scientist to be convinced he is on to something grand when he is actually only on to something bland. Any fool can see.
It takes thinking to see how and why.

So you're saying, he said, that you have some sort of way out of this conundrum? I didn't say that at all, I said. But now
that you've asked, YES. What is it? He asked. Imagine, if you will, your precious digital computer. What does it do? It
computes, he replied. Sure it does, but what it actually does is move particles about in an electronic process that shoves
them along their way through the sets of instructions, where they come out in a different order that matches the instructions.
OK, you could say that as a simple explanation, he said.

But the brain moves particles from one neuron to the next without the amperage of a computer. How does it move the
particles? Go ahead, he said..... It moves them by their own properties. Remember, it is a natural process. The particle is
observed as both a particle and a wave. So the brain moves the particle by its own wave. The wave moves the particle
through the process, where the particle is but one of many in a cluster that equals a value. That value is changed along
the pathway.

But how does the brain get ahead of the wave of the particle? He asked. By the process we have already identified, I
replied. The comparison value is being pushed by a higher frequency wave than the input value. In the human there is
yet another, higher frequency wave. But if that were the case we would already have identified the frequencies. He stated.
No we would not, I replied.

Every input receptor, from the single rod or cone of the eye to the hair cell of the inner ear, is running on its own
frequency wave. Individually they are not detectable, but they are detectable when they converge. Harmonics? He
asked. Yes. Harmonics, I replied. The harmonics are strong enough to be observed, and we do observe the strongest of
them each time we turn on an EEG. The source of those signals is not found, because the source is the entire brain's
convergence of individual frequencies. Those convergences occur more strongly in some parts of the brain than in others.
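
Whatever the reader makes of the claim itself, the convergence idea is simple to simulate. A toy Python sketch of my own: sum a few hundred weak oscillators at slightly different frequencies, and no single one is detectable while their combined signal rises to an observable level.

    import math
    import random

    random.seed(1)
    # Many weak oscillators, each on its own slightly different frequency.
    freqs = [10.0 + random.uniform(-0.5, 0.5) for _ in range(200)]

    def combined(t):
        # Each contributes a mere 0.01 in amplitude; only the
        # convergence of all of them is strong enough to observe.
        return sum(0.01 * math.sin(2 * math.pi * f * t) for f in freqs)

    for t in [0.0, 0.025, 0.05, 0.075, 0.1]:
        print(f"t={t:.3f}  summed signal={combined(t):+.3f}")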

But we see, I mean we observe, the parts of the brain functioning for each part of the observed output of the brain. Explain
that one. Easy, I replied. The brain is hardwired. Sections of it are occupied by hardwired connections to the inputs of a
function, with the outputs also hardwired. So when you or your colleagues observe the pretty colors of an fMRI machine
you are seeing the activity of the section of hardwired neurons for the outcome you are seeking. But you don't consider
that it has an input. It performs a computational function and it results in an output.

You are only concerned with the output. So you declare particular areas of the brain to be in control of
emotions and fear and vision and hearing, and you wonder why some areas are active when a person speaks a second
language other than their native tongue. It not only takes the input of the hearing, which outputs to the speech; it
takes memory of the comparison between the two languages to make the second one possible. It is the comparative
memory you are seeing. Nothing abnormal or weird about it.

So why haven't you proven this theory of yours? He asked. It's not a theory. It is proven by the mathematics of the
process of the brain. But as we have learned from this discussion, it is not the fact that there is proof available; it is the fact
that science today does not consider it an issue. I have mathematical proof, and the ability and complete plans to build a
duplicate of the human brain to physically prove it. It is the hurdle past the perception of the Object Based Reasoning
approach to science that stands in the way. It has stood in my way for three years now, ever since I perfected the formula.
Which was not easy, I might add. I too had to surpass the Object approach to the brain.

Well, I wish you luck, he said. But I still hope you don't underestimate the potential of the digital process, he added.
Fear not, my friend, I said. The great wall of neuroscience will finally come down when the Neutronics
intelligent computer is built, when someone with the money to accomplish it finally sees the reality of the illusions of
digital, funds the company that will build it, and becomes the next Bill Gates and Ross Perot combined. And when it falls,
it will fall on the digital process.

He only laughed.

By the way, I asked, who was it that said that 640K should be enough for anybody's needs? Some idiot, I presume, he
replied. No. It was Bill Gates.

Silence.