
The Parser Mind-Module


   /^^^^^^^^^\ A Parser Determines Parts Of Speech /^^^^^^^^^\
  /   EYE     \  MINDCORE               _____     /   EAR     \
 /             \ CONCEPTS              /New- \   /             \
|   _______     |   | | |      _____  (Concept)-|-------------\ |
|  /old    \    |   | | |     /Old- \  \_____/  |  Audition   | |
| / image   \---|-----+ |    (Concept)------|---|----------\  | |
| \ recog   /   |   | | |     \_____/-------|---|-------\  |  | |
|  \_______/    |  a| | |      |   |________V   |  c    |  |  | |
|               |  b|C| |      |  / Parser() \  |   a   |  |  | |
|   visual      |  s|O|f|      |  \__________/  |    t  |  |  | |
|               |  t|N|i|   ___V____   |noun?   |     s-/  |  | |
|   memory      |  r|C|b|  /Activate\  |verb?   |          |  | |
|               |  a|E|e|  \________/  |adj.?   |  e       |  | |
|   channel     |  c|P|r|   ___|_____  |adverb? |   a      |  | |
|               |  t|T|s|  /spreadAct\ |prep.?  |    t-----/  | |
|   _______     |   | | |  \_________/ |conj.?  |             | |
|  /new    \    |   |_|_|  /     ______V____    |  f          | |
| / percept \   |  /     \/     /           \   |   i         | |
| \ engram  /---|--\ Psi /-----( Instantiate )  |    s        | |
|  \_______/    |   \___/       \___________/   |     h-------/ |

diagrams.html shows a Theory of Mind.

The Robot AI Mind uses a dynamic rather than a static parser; that is,
the goal of the Parser module is to determine each part of speech
afresh from the dynamics of an incoming sentence of input.

This generic AI textbook chapter is a schedule of considerations
for coding a Parser module for any species of robot AI Mind.


Parsing in the AI will be like parsing in human infancy. The Mind
will be able to parse those categories that it itself is capable
of generating: nouns first, and verbs only later.

By default, a child will first treat everything as a noun --
pointing and naming things.

By a stepdown process, the Mind may parse initial input as a noun
both from sequential position and from recognitional memory, then
stepping down to a verb-phrase structure that must be filled with
elements that include a verb.
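The stepdown expectation can be sketched in JavaScript. The function name stepdown and its logic are illustrative assumptions; only the numeric part-of-speech codes (noun = 5, verb = 8) come from the Parser() listing later in this chapter.

```javascript
// Hypothetical stepdown: begin by expecting a noun (pos code 5),
// then step down to expecting a verb (pos code 8) once a noun
// has been parsed, as an infant mind first treats words as names.
var NOUN = 5;
var VERB = 8;

function stepdown(parsedSoFar) {  // parsedSoFar: pos codes seen so far
  var sawNoun = parsedSoFar.some(function (pos) { return pos === NOUN; });
  return sawNoun ? VERB : NOUN;   // after a noun, expect a verb
}
```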

The "stepdown" process permits a mind to ignore the role of
memory and instead to treat any forthcoming lexical input as a
candidate for the part of speech currently expected.

Although the Parser identifies nouns, verbs, etc., it does not
directly fit the lexical elements into the slots of a sentence
structure so as to comprehend the input sentence during the
parsing. Comprehension ensues immediately when the Mind generates
a thought about what it has just heard.


A clarifying and instructive approach to parsing is to
contemplate various reduced sets of categories to be parsed,
such as for example:
- only nouns and verbs;
- only articles, nouns and verbs;
- only nouns, verbs and adverbs.
A reduced category-set of only nouns, verbs and adverbs would
still permit negation with the adverb "not."
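Such a reduced category-set can be illustrated with a toy tagger. The word lists and the function name reducedParse below are invented for this sketch:

```javascript
// Illustrative tagger for a reduced category set of only nouns,
// verbs, and the negating adverb "not".  Word lists are toy data.
var nouns = ["boys", "books", "robots"];
var verbs = ["like", "read", "think"];

function reducedParse(word) {
  if (word === "not") return "adverb";          // negation survives
  if (verbs.indexOf(word) >= 0) return "verb";
  if (nouns.indexOf(word) >= 0) return "noun";
  return "noun";  // default: treat any unknown word as a noun
}
```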


Although each AI programmer is free to use whatever intricate
complexity proves necessary to get the job done, a useful
consideration is the idea that the ensuing or resulting code
should mimic as closely as possible the propagation of signals
in a massively parallel neuronal mindgrid. Because of
massive parallelism, when we humans hear an assertion,
the conceptual associations immediately reverberate.
Any use of single flags or of trains of equipositional tags
in the AI code must somehow resemble the neuronal propagation
of signals. Such equivalency between code and neurons is
a valuable indicator of algorithmic validity.


The English bootstrap "enBoot" module may ease burdens on the
Parser module by means of a stare decisis reliance on
previously decided parsing problems.

Since there are lists available for the most frequent words
of various natural languages, it makes sense, where possible
and where convenient, to favor the bootstrap-inclusion of
higher-frequency words over lower-frequency words.
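A sketch of such frequency-favoring selection, with an invented toy frequency list and function name:

```javascript
// Sketch: choose bootstrap entries by favoring higher-frequency
// words over lower-frequency words.  Frequencies are toy data.
var freqList = [
  { word: "the",     freq: 1000 },
  { word: "of",      freq:  800 },
  { word: "zymurgy", freq:    1 }
];

function bootstrapWords(list, limit) {
  return list.slice()                                     // copy, then
    .sort(function (a, b) { return b.freq - a.freq; })    // most frequent first
    .slice(0, limit)
    .map(function (e) { return e.word; });
}
```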

Such a policy of bootstrapping higher-frequency words may
become obsolete or "moot" as the bootstrap approaches saturation
with essentially all the words comprising a full dictionary of
the target language.

Gradually all prepositions for a given language and all irregular
verb forms may be instantiated within a bootstrap module so that
the AI Parser module may easily recognize and parse such special forms.

Although previous knowledge of lexical vocabulary items will hint
as to the part-of-speech of a word, a more important determinant
is how a word is being used explicitly in a sentence of input.

Auditory recognition may cause information on the previous parsing
of a lexical item to be retrieved. There ought to be an override
mechanism to let a known concept be used as a part of speech
different from the most recent prior usage.
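One hedged way to sketch such an override mechanism, with invented names throughout:

```javascript
// Hypothetical override: the remembered part of speech from a prior
// parsing is only a default, and the positional expectation may
// override it when the evidence for the expectation is strong.
function resolvePos(rememberedPos, expectedPos, strongEvidence) {
  return strongEvidence ? expectedPos : rememberedPos;
}
```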

Although unknown words will be remembered, unknown categories
will be disregarded. To do so is a powerful technique, because
it enables the AI not to choke on unfamiliar input.

Defaults may be used as an aid in coding the Parser module.
For instance, if we assume that a typical sentence will
have a verb, we may have a default requirement that one
word or another must be declared as a verb -- although
another guideline may suggest that the initial word in
a sentence is typically not the verb.
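These two defaults together might be sketched as one default-test (the function name and tag values are assumptions for illustration):

```javascript
// Hypothetical default-test: ensure some word in a finished sentence
// is declared a verb, preferring a non-initial candidate because the
// initial word of a sentence is typically not the verb.
function applyVerbDefault(tags) {  // tags: array of "noun"/"verb"/"?"
  if (tags.indexOf("verb") >= 0) return tags;  // a verb was found
  for (var i = 1; i < tags.length; i++) {      // skip the initial word
    if (tags[i] === "?") { tags[i] = "verb"; return tags; }
  }
  return tags;
}
```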

By the doctrine of defaults, the whole Parser module may
be seen as a kind of "snare-net" or thicket of default-tests.

It is possible that a brain-mind does not really categorize
words by part of speech, but simply attaches relational tags
which in turn cause each word to operate and function as a
particular part of speech.


The equipositional (i.e., spaced in a constant arrangement) Psi
concept tags to be encoded in the Instantiate module include the
following: psi; act; jux; pre; pos; seq; enx.

psi = Psi concept number to identify the mindcore concept;
act = ACTivation level of the concept for a brief time;
jux = the JUXtaposed or nearby previous and influential concept;
pre = PREviously relational concept in a grammatical sense;
pos = Part Of Speech to be determined or assigned by the Parser;
seq = subSEQuently relational concept in a grammatical sense;
enx = ENglish transfer tag for reifying Psi concepts into English.
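The seven equipositional tags might be modeled as one record per instantiated concept. The field names follow the chapter; the constructor function and the sample values are invented for this sketch:

```javascript
// Sketch of one Psi-concept instantiation carrying the seven tags.
function makePsiNode(psi, act, jux, pre, pos, seq, enx) {
  return { psi: psi, act: act, jux: jux, pre: pre,
           pos: pos, seq: seq, enx: enx };
}

// e.g., a noun concept (pos = 5) with a transient activation of 40:
var node = makePsiNode(72, 40, 0, 0, 5, 0, 72);
```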

Parallel if not multiple tag tracks may be used as a form of
quasi-linguistic logical circuitry. For example, suppose you have
a lexical tag position that records only the previous psi number.
A string of such tag positions, combined with a different tag
position serving as a kind of terminator, could go in search of
terminating values. Consider the sentence, "Boys like books with
pictures." If "books" has a "pre" tag, that tag, or its
existence, may serve as a terminator in any search along a
non-pre track backwards from "pictures."
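A minimal sketch of such a terminator search, assuming a toy representation where each word-node carries a numeric pre tag (zero meaning no tag):

```javascript
// Illustrative backward search: starting from "pictures", walk back
// along the sentence until a node carrying a "pre" tag terminates
// the search.  Nodes and tag values are toy assumptions.
var sentence = [
  { word: "boys",     pre: 0 },
  { word: "like",     pre: 0 },
  { word: "books",    pre: 1 },  // "books" carries a pre tag
  { word: "with",     pre: 0 },
  { word: "pictures", pre: 0 }
];

function searchBack(nodes, startIndex) {
  for (var i = startIndex; i >= 0; i--) {
    if (nodes[i].pre !== 0) return nodes[i].word;  // terminator found
  }
  return null;  // no terminating pre tag along the track
}
```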

No limit should be set on the number of kinds of tags or flags
which a quasi-fiber may display.

AI coders have the option of exhaustive brute-force in the
Parser module so as to make it more powerful, but a simple
and elegant solution is more desirable.


The SVO module implements subject-verb-object syntax.



Prepositional phrases may be thought of as always trying to
assert themselves during generation of a sentence. For any
given utterance, the Mind is not trying to use one particular
preposition to introduce a phrase, but rather all twenty or
thirty of its prepositions. Only a preposition with the highest
above-threshold activation will actually be selected for
insertion into an utterance. Therefore the sentence-generation
tree is not rigid from its apex downwards but dynamically loose,
with prepositional phrases popping up in the due course of
spreading activation.
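The winner-take-all selection among competing prepositions might be sketched as follows; the threshold value, activation numbers, and function name are all invented for illustration:

```javascript
// Sketch: among all candidate prepositions, select only the one
// whose activation is above threshold and highest.  Toy data.
var THRESHOLD = 20;
var preps = [
  { word: "on",   act: 35 },
  { word: "in",   act: 15 },
  { word: "with", act: 28 }
];

function selectPreposition(candidates) {
  var best = null;
  for (var i = 0; i < candidates.length; i++) {
    var c = candidates[i];
    if (c.act > THRESHOLD && (best === null || c.act > best.act)) best = c;
  }
  return best ? best.word : null;  // null: no phrase asserts itself
}
```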

An utterance does not have to be a complete sentence.
If there is only enough excitatory activation for a
prepositional phrase, then we may get an utterance
like, "On the desk! In the box!"

In a general, all-purpose AI Mind, innate mechanisms for the
looping insertion and deletion of syntactic nodes will enable
the AI to learn syntax from a linguistic environment, rather
than using only the specific structures expressly coded in.


A sentence being heard will typically get only one chance to
be parsed correctly, but a sentence being read and studied
will have multiple chances if an initial pass fails to make
sense of the sentence.

How does a Mind know that it has not made sense of a sentence?
To decipher sense, the Mind must recognize not only the words
but also the relationships. The comprehending Mind may
generate an understood sentence as a replica of itself.
If sense has not been made, the Mind tries and fails to
generate a bounce-back thought.


As each new functionality is added, the Parser module
must support each linguistic structure available to the
English module for thinking in English.

Once the AI knows all the parts of speech and has become a
sophisticated speaker of English, it becomes algorithmically
reassuring that no high-quality (e.g., reference library) input
should present a mystery or otherwise stump the Parser module.
If the AI were still learning English syntax, mysterious inputs
would be a problem for the self-organizing of the Mind.
As the AI shifts from being a learner to being an expert
speaker, the assumed location of problems shifts from a
presumption of defects in the artificial mind to an
expectation that any problems must lie in the corpus
of input that is causing difficulty for the AI Mind.
"I'm okay, therefore it's the data that are not okay."


Although Parser resources on the World Wide Web may provide
information about traditional parsers, let there be no
prejudice in favor of the way it has always been done.

The diagram ai4u_157.html is a flowchart of Mind.

11. JavaScript artificial intelligence source code of 12 August 2002
// Parser() is called from oldConcept or newConcept
// to help the Artificial Mind comprehend verbal input
// by properly assigning associative tags with flags.
function Parser() {  // ATM 1aug2002; or your ID & date.
  // The "bias" has no control over recognized oldConcept words:
  bias = 5;  // Initial bias is for a noun=5.
  Instantiate();  // Create a new instance of a Psi concept.
  // After a word is instantiated, expectations may change.
  // Recognitions and expectations are pos-codeterminants.
  if (pos == 5) bias = 8;  // After a noun, expect a verb.
  if (pos == 8) bias = 5;  // After a verb, expect a noun.
  jux = psi; // but only for the next time around, not now.
} // End of Parser; return to oldConcept() or newConcept().
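To see the noun/verb bias flip of the listing above in isolation, here is a hedged standalone sketch; Instantiate() and the Psi tags are stubbed away, and only the alternation of expectations is reproduced:

```javascript
// Standalone trace of the bias flip in Parser() above.
// pos is supplied by hand instead of by Instantiate().
var bias = 5;                 // initial bias is for a noun=5
function parseWord(pos) {     // pos as determined for the word
  if (pos === 5) bias = 8;    // after a noun, expect a verb
  if (pos === 8) bias = 5;    // after a verb, expect a noun
  return bias;                // the expectation for the next word
}
```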

12. Mind.Forth free AI source code of 4 August 2002
\ PARSER is called from oldConcept or newConcept
\ to help the Artificial Mind comprehend verbal input
\ by properly assigning associative tags with flags.
:  PARSER  \ ATM 22jul2002; or your ID & date.
  \ The "bias" has no control over recognized oldConcept words:
  5 bias !  \ Initial bias is for a noun=5.
  uract @ 1 - uract !  \ 26jul2002 Decrement "uract" from HCI.
  uract @ act !  \ 26jul2002 Activation will soon be restored. 
  INSTANTIATE  \ Create a new instance of a Psi concept.
  \ After a word is instantiated, expectations may change.
  \ Recognitions and expectations are pos-codeterminants.
  pos @ 5 = IF  8 bias !  THEN  \ After noun, expect a verb.
  pos @ 8 = IF  5 bias !  THEN  \ After verb, expect a noun. 
  psi @  jux !  \ but only for the next time around, not now.
;  \ End of Parser; return to oldConcept or newConcept.

variable.html explains the location and purpose of each AI variable.


The Parser code above is a primitive but radical departure
from the merely simulated parsing in the earliest versions
of the AI Mind source code. In the simulation of parsing,
users were required to enter rigidly formatted subject-verb-object
input of no more than three words in the obligatory S-V-O order,
so that the AI could automatically parse a noun-verb-noun sequence.
Such studied neglect of true parsing was necessary during the
quasi-embryology and "quickening" of the complex AI software.
The rationale was, if several dozen modules were necessary for
a minimal but sufficient implementation of artificial intelligence,
then it was more important to get the whole package up and running
than it was to develop any single module more elaborately than others.

Accordingly some modules, although kept simple in their nature,
were renamed with designations such as enBoot or enVocab so
as to suggest that similar modules could be created for non-English
languages, and some superfluous modules were either eliminated
or un-factored into more comprehensive modules, but no attempt
was made to develop extreme sophistication in any particular module.

The Parser module, however, is now poised for further development
and it cries out for all the sophistication that anyone may throw at it.
Whereas it was impossible to make a three-word S-V-O structure more
sophisticated, now the emerging Parser module is initially just as
primitive as the simulation of parsing but has immense room to grow.

The very process of algorithmic growth involves looking at what
exists already and discerning opportunities for future improvement.
Seen in such light, the early, undifferentiated Parser presents a
maximum of opportunities that will shrink as they are realized.
Let your mind parse: "There is a tide in the affairs of men...."

The AI Mind Parser does not merely generate a report or table of tags;
it comprehends what it parses by the very act of connecting tags from
one parsed concept to associated concepts. Suddenly the IQ of the AI
is set to burgeon as the functionality of the Parser module burgeons.
Does not rumor have it that some software applications are predominantly
a large and complex parser in support of some relatively small function?
In the parser is the power. Just as (64-bit or bust!) memory space is
the overwhelming bulk of a robotic or human brain-mind operating under
the command of a relatively tiny control-structure, likewise the Parser
may grow relatively larger and larger vis-a-vis the other control elements.

The initial simplicity of the Parser should be seen not as a fault but
as a challenge. Any enhancement must preserve the trade-off between
parsing by recognition and parsing based on what part of speech is expected.
If there seems to be no challenge present for a programmer already in
possession of sophisticated parsing software, then perhaps the challenge
lies in adapting all available expertise to the AI Mind architecture.
And if the task has been accomplished in one programming language, then
the challenge lies in porting the proven expertise to unMinded languages.

Enhancement of the Parser enables the encoding of new syntax structures,
until the point is reached where a general syntax-learning mechanism
removes the need for the hand-coding of each new syntactic structure.

