NTC ORANGES AND PLUMS

December 14, 2019
(C) 1996 Lee Kent Hempfling. All Rights Reserved.


          They are different. In every aspect of their makeup they are different. No one would confuse a plum for an orange; there are too many differences. But if all you knew was the plum, the teacher would use it to describe the orange. Both are about the same shape, but different in size. Both are about the same texture inside, but each is covered with a different skin. The skin is a different color, yet it is still comparable to the other because it is skin. Both are fruits, but neither is the same fruit. So it is with the method used in this document to describe the quantum 5th generation computational process. It has been compared to the digital process to make it easier to understand. But if you had grown up with nothing but the plum, an orange would always be compared to it each time you ate it, regardless of whether you liked it better.
          Another way of looking at the enlightenment process is to consider a child who grew up in a red room. From birth the child was confined to the room, and everything in the room was red. So, to the child, everything would be red. Not because the child was unable to see other colors, but because the child was never presented with another color and didn't know any better.
          What happened when the child was finally permitted to leave the red room? The child refused to believe the overwhelming evidence that millions of other colors exist. The child resisted the knowledge. The child returned to the red room, and there he stayed, afraid to come out, because the red room not only offered a known environment, it offered a safe one.
          Once forced to remain outside the red room, the child developed anxiety over the differences between his acquired perception of color and the reality of the real world. Would it have been more humane to leave the child in the red room? No. It was inhumane to limit the child to the red room in the first place. Humans are a product of what goes in. Consistent reinforcement of that input sets up a norm that the brain uses to compare against all future input, making the reinforced input the basis on which all other input is judged.
          Before releasing the child into the world, the necessary step would have been to slowly and methodically introduce additional colors, without the child's direct notice, so that a new reinforced norm could develop. Then opening the doors to the world at large results in comparisons to things that already make sense; the anxiety no longer exists, and neither does the refusal to accept reality.
          In that process, taking 'baby steps,' as Bob's psychiatrist called them in the movie, is required for advancement. How, then, does something so new and so foreign to the currently accepted norm become understood? Slowly, and with a method of instruction by comparison to existing technology. But the 5th generation system actually has only one basic comparison to digital processing: data goes in and data comes out. That's it. Nothing in between is the same.
          To fully understand the differences, the way the current norm was acquired has to be understood. This is perhaps the easiest part to explain and the most difficult to grasp. Once something has been accepted for a long time, the reason for its conception becomes lost in its implementation. After a while the method of acquisition is no longer needed to make new comparisons and is 'forgotten.'
          In a world composed of grey areas, the simplest convention was accepted to denote absolutes. Either something was on or it was not. A switch makes things either on or off, and a switch is an easy thing to make. Either one asks a question or one doesn't. Either one answers the question or one doesn't. It was never considered that the answer is the sought-after end, not merely whether it was presented. Accuracy and efficiency were never considered until the process of making absolute judgements was in place. Either an instruction was present or it was not. What the instruction intended could then be broken down into smaller increments of yes-no, and the result is the programming process of today's digital computers.
          It is understandable how this took place. The computer was designed for the purpose of producing numerical solutions, for counting numbers. The numbers were then extended to represent letters, and the arrangement of those represented letters resulted in perceived concepts. There are no concepts in a digital process, but the relationship of one number group to another, after translation to their representational letters, resulted in a concept being understood or conveyed. That is where the confusion set in.
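          As a rough illustration of how a machine with no concepts reduces letters to numbers and numbers to yes-no switch states, consider the sketch below. It is only a sketch: the ASCII encoding and the bit-by-bit comparison are ordinary digital practice, used here to make the point concrete, not part of this document's process.

```python
# Illustrative sketch only: how a machine with no concepts compares two letters
# as numbers, one yes/no switch state at a time.

def bits_equal(a: str, b: str) -> bool:
    """Compare two single characters by asking eight on/off questions."""
    x, y = ord(a), ord(b)              # to the machine, letters are only numbers
    for i in range(8):                 # one yes/no decision per bit
        if (x >> i) & 1 != (y >> i) & 1:
            return False               # a single "no" settles the comparison
    return True                        # eight answers of "yes"

print(ord("Q"))              # 81 -- the letter is stored as a number
print(bits_equal("Q", "Q"))  # True
print(bits_equal("Q", "R"))  # False
```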
          All of a sudden it was no longer considered that a device would first have to accept a concept before it could make a number judgement, or else such a number judgement would be nothing but an absolute without meaning. The meaning is established by the user, not by the machine doing the computation.
          From that beginning it was necessary to increase the speed of computation so the user's perception of a concept would be more closely matched by the machine's outpouring of the parts of that concept. The faster the comparative relationship became, the more the user began to confuse the machine with his own perception. From that misunderstanding came the notion that a totally ignorant device could produce coherence without ever having any of its own.
          Ever since the introduction of the digital process, scientists have made comparisons to the brain. But there is only one comparison: data goes in and data comes out. It is what happens in between that sets up the difference. The brain deals in absolutes of variable values. It makes comparisons between absolutes and arrives at a mixture of both. That mixture is then further compared to other mixtures, and the result is recognition of partial matches instead of recognition of a giant collection of absolutes.
          The digital computer does its computations based on single instructions. Each single instruction is combined with other single instructions, and a multiple output is displayed on a screen that, as observed, blends together into what appears to be a single output, when in fact it is not. There is no relationship between one pixel and another in a graphic depiction. They are all results of a fast process of addition and subtraction.
          The brain, on the other hand, adds and divides. The subtraction process occurs in the comparison of two absolute values and results in a division. This is the same process observed as entropy. Each value of the brain's process is one step smaller than its previous value as it relates to the value it is compared to. It is this reduction that places memory in a time relationship. The older the memory, the smaller and less 'recallable' it is. But should a memory be consistently reinforced (such as in the concentrated training process of repetition), the old memory remains old, but the newer memory continues the older memory's values and compares them with more new input, making comprehension of concepts that much more possible.
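          A minimal sketch of one possible reading of that decay-and-reinforcement description follows. The halving rule and the additive reinforcement rule are assumptions made only to illustrate the prose; they are not the document's formula.

```python
# Assumed rules, for illustration only: each comparison step leaves an older
# value one step smaller, and repetition carries the old value into a new trace.

DECAY = 0.5   # assumed: each comparison step leaves the older value one step smaller

def age(trace: float, steps: int) -> float:
    """Older memories shrink with every comparison step that follows them."""
    for _ in range(steps):
        trace *= DECAY
    return trace

def reinforce(old_trace: float, fresh_input: float) -> float:
    """Repetition: a newer memory continues the older memory's value,
    keeping the concept recallable even though the old trace stays old."""
    return old_trace + fresh_input     # assumed combination rule

print(age(1.0, steps=8))                   # faded: small and hard to recall
print(reinforce(age(1.0, steps=8), 1.0))   # reinforced: recallable again
```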
          In the digital computer the memory is solo. It has no relationship to any other memory. The memory of a byte representing the letter Q is compared to a Q and results in a yes. The comparison of a Q to an R will result in a no. In the brain, the memory of a Q compares with its shape, its previous use, its being a symbol of communication, its sound as a letter, its motor motivation to construct a Q, and its most recent and most supported use. In the brain, the value of a Q in memory indicating the sound of the letter may equal the value of some other, totally non-relationship-based input. Hypothetically, establishing a reference value of .00005 volts for the sound of the letter Q will result in its comparison with every other sound value within the scope of the sound-value search. It will directly compare to another value of .00005 volts, and that other value may not even be a letter sound. It could be the sound of a portion of a note from a trumpet. Things that do not outwardly compare do compare inwardly, and it is that inward comparison that sets up the correlation between non-comparative things, binding the external input of the device to the output of the device that acts upon it.
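          The sketch below illustrates that contrast: an exact digital match versus a value-based search that correlates inputs with nothing outward in common. The .00005 volt reference comes from the text; the stored entries are invented solely for illustration.

```python
# Solo digital memory: exact yes/no matching.
digital_memory = {"Q": "Q"}
print(digital_memory.get("Q") == "Q")       # yes
print(digital_memory.get("Q") == "R")       # no

# Value-based memory: entries from different "senses" compared by shared value.
value_memory = [                            # assumed stored values, in volts
    ("sound of the letter Q", 0.00005),
    ("portion of a trumpet note", 0.00005),
    ("sound of the letter R", 0.00009),
]

reference = 0.00005                         # value assigned to the heard sound
matches = [label for label, v in value_memory if abs(v - reference) < 1e-9]
print(matches)    # unrelated things that "compare inwardly" by sharing a value
```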
          As a thinking human, if you had to stop each time you heard the sound of the letter Q and evaluate its potential as the letter Q, totally disregarding its value as the sound of anything else, then you would have to stop and evaluate the sound all over again if you heard the same value from a trumpet. If that were the case, your brain would never amount to any outward action unless your comparison process were increased in speed to a level no biological clock could even attempt to regulate.
          Digital computers act this way. They don't seem to, because they perform their single, non-relatable tasks at speeds far exceeding yours. They accomplish that by counting and adding frequencies. The biological clock can only establish a value and act upon reduced levels of it. See 'The Biological Clock' for more information on this function.
          While it may seem extremely slow to run a quantum computational device at a maximum frequency of just 35 kHz, it is actually exponentially faster in effect than the digital computer's frequency of 133 MHz. The quantum computer's output is relational to itself. The digital computer's output is relational only to the observer of the output, and it is only as efficient and understandable as its speed advantage over that observer allows.
          The quantum computer's output is instant, as the frequency of 35 kHz is used to compute by setting up increasing wavelengths from divisions of the maximum frequency as compared to the input. The digital computer functions at the same speed throughout its computational process and is delayed in its output by the number of shared tasks it must perform and the complexity of the programming instructions it must complete.
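          One literal, illustrative reading of "increasing wavelengths from divisions of the maximum frequency" is sketched below. The integer divisions and the nearest-match rule are assumptions made for the sketch, not the document's actual method.

```python
MAX_FREQ_HZ = 35_000.0            # the 35 kHz maximum frequency from the text

def division_table(n):
    """Each division of the maximum frequency yields a lower frequency and a longer period."""
    return [(MAX_FREQ_HZ / k, k / MAX_FREQ_HZ) for k in range(1, n + 1)]

def nearest_division(input_freq_hz, n=16):
    """Pick the division whose frequency lies closest to the input."""
    table = division_table(n)
    return min(range(n), key=lambda i: abs(table[i][0] - input_freq_hz)) + 1

for freq, period in division_table(4):
    print(f"{freq:8.1f} Hz   period {period * 1e3:.4f} ms")

print(nearest_division(9000.0))   # which step an assumed 9 kHz input compares to
```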
          The quantum computer has no delay. Its output is fully computed, referenced, and committed to memory for the next input's use at the moment it is inputted. Real time. Controlled only by the speed of travel of the free electron through its pathway connections, approaching two-thirds the speed of light.
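          As a back-of-the-envelope check, the sketch below converts that stated speed into a transit time. The two-thirds-of-light figure comes from the text; the 1 cm pathway length is an assumption chosen only to put a concrete number on "real time."

```python
C = 299_792_458.0            # speed of light in vacuum, m/s
v = (2.0 / 3.0) * C          # stated travel speed through the pathway
pathway_m = 0.01             # assumed pathway length: 1 cm

transit_s = pathway_m / v
print(f"{transit_s * 1e12:.0f} ps")   # roughly 50 picoseconds per centimetre
```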
          Where the digital computer must use multiple processors, each sharing tasks, to combine them into a parallel system, the quantum computer is a parallel system that uses dedicated processors for each input, joining them in the output stage (see limbic system) to result in correlated and synchronous binding. From the observation of a single concept to the output of a single action, many thousands of parallel computations must take place.
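          A rough sketch of that dedicated-processor-per-input idea follows, using ordinary Python threads. The per-channel computation and the way the channels are "bound" at the output stage are assumptions made only for illustration.

```python
from concurrent.futures import ThreadPoolExecutor

def dedicated_processor(channel, value):
    """Each input channel has its own processor; nothing is shared between them."""
    return channel, value * 0.5            # assumed per-channel computation

inputs = {"sight": 0.8, "sound": 0.3, "touch": 0.6}

with ThreadPoolExecutor(max_workers=len(inputs)) as pool:
    futures = [pool.submit(dedicated_processor, ch, v) for ch, v in inputs.items()]
    results = dict(f.result() for f in futures)

# Output stage: the separate channels are joined ("bound") into one result.
print(results)
print(sum(results.values()) / len(results))   # a single correlated output value
```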
          In the paper "Why Digital Computers Will Never Be Intelligent" reference is made to these differences. After the paper was presented to numerous neurological scientists and computer AI programmers, the only statement that could be received was the one that was given: "Of course, I disagree with you." Of course they would. They are so reinforced in the power of the digital process that they fail to see the obvious.
          Nothing bound together can come from an unbound system. If there is no binding, there can never be binding. If there is no order, there can never be disorder, let alone additional order. It is this finding that further makes this discovery important. We know the velocity of the particle because we establish it by the frequency of the wave it is propelled on. We know the location of the particle by which step (wave) in the process it is synchronized to. If that were not an integral part of this process, one would never be able to know for certain whether any memory value was in memory or was still being processed in comparator functions.
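          One literal, illustrative reading of that claim is worked below: the velocity is set by the wave that carries the particle, and the location follows from which step (wave) of the process it is synchronized to. The two figures used come from the text; the step index is invented for the example.

```python
C = 299_792_458.0
v = (2.0 / 3.0) * C           # stated velocity of the free electron
f = 35_000.0                  # stated maximum frequency, 35 kHz

period = 1.0 / f              # duration of one step (wave) of the process
step = 7                      # assumed: the step this value is synchronized to
elapsed = step * period       # how long the value has been in the process
distance = v * elapsed        # and therefore where along its pathway it sits

print(f"elapsed {elapsed * 1e6:.1f} us, distance {distance:.0f} m")
```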
          Once Neutronics Technologies Corporation establishes a commanding lead in the computational field, it will begin to examine applications of the formula in other venues. Applications extend from particle physics to cosmology, from chemistry to metallurgy, from botany to biophysics.
          NTC will command the lead not only in the next generation of computers but in the next scientific step past the DNA hurdle. DNA and RNA function by the same method of controlled entropy as the computational process. It is just as easy to envision a chemical reactive process of mental computation being reduced to its particle state as it is to envision the chemical reactive process of the A, C, T, and G chemicals of DNA in its computational process.
          But that step will demand the devoted explanation and the expense of research required of it, and, once again, explanations given a little at a time with comparisons to existing understanding, before it becomes a viable laboratory issue.
          When the position and velocity of a particle are understood, there is not much standing in the way of utilizing that knowledge. The only thing that does stand in the way is acceptance of the concept.
          NTC will prove it by building the machine that duplicates the process performed by the living brain.

          CONCLUSION

          The amount of material contained in this process is so immense that its mere public receipt would cause others to attempt to utilize its methods. I cannot stress enough the importance of this project. Every precaution will be taken to secure the laboratory and to offer security for its inventions.
          It is not the first time in history that a company will be formed to reach a goal that has only been dreamed of before. But it is the first time in history that a company can actually, with its own formula, control its future.
          The investor who chooses to back this firm's creation and development will receive rewards beyond
imagination.