
Language Learning

Within linguistics, the generative grammar school and the cognitive linguistics school each have their own hypothesis about how language is learned.

There is debate over which is correct, but I do not think it is a mistake to treat both as useful starting points. From that standpoint, this page summarizes what is known so far about theories of language learning.

Generative grammar

Generative grammar is a hypothesis proposed by Noam Chomsky.

The hypothesis is that humans are born with a grammar that underlies all languages, and that this grammar becomes specific languages such as English and Japanese.

Based on the idea that "if such a grammar exists, it should look something like this," research continues on grammars capable of producing the world's various languages.

Cognitive linguistics

Cognitive linguistics holds that humans acquire grammar through the process of cognition, rather than being born with it. Language acquisition is assumed to proceed in a complex way from many kinds of perception, such as sight and hearing.

Word meaning is often described as encyclopedic: a single word has many uses and facets, and words form a network of nearer and more distant relations to one another. Accumulating this network through experience is, I believe, also part of acquiring a language.

Cognitive linguistics also includes the idea that human thinking refers to things that have been experienced or abstracted from experience, and that this is what connects meaning and language. In working out these mechanisms, the question "why can we communicate through metaphor?" seems to have served as a hint for research.

Before cognitive linguistics, generative semantics emerged as a relative of generative grammar, but it did not work well because it assumed a mechanism that directly connects meaning and language.

Relevance to machine learning

Both cognitive linguistics and generative grammar offer hypotheses about how languages are acquired, and each seems to correspond to a different approach within machine learning.

A hypothesis of language acquisition by generative grammar

In generative grammar theory, consider a simple example:

Y = a * X + b ... generative grammar
Y = 2 * X + 1 ... language A
Y = 5 * X - 4 ... language B

Language acquisition is then thought of as the process of determining these parameters. This way of thinking also explains how humans can learn a particular language without a huge amount of data.
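Here is a minimal sketch of the analogy in Python, assuming the "universal grammar" is the template Y = a * X + b and that exposure to a language is a handful of (X, Y) observations; the sample points are invented:

    def fit_language(samples):
        # Solve for (a, b) in Y = a * X + b from two observed (X, Y) pairs.
        (x1, y1), (x2, y2) = samples
        a = (y2 - y1) / (x2 - x1)   # slope from two points
        b = y1 - a * x1             # intercept follows immediately
        return a, b

    # Two "utterances" from language A (Y = 2 * X + 1) are enough:
    print(fit_language([(0, 1), (3, 7)]))   # -> (2.0, 1.0)

Because the template is fixed in advance, two data points pin the language down; without the template, no number of points would even tell you to look for a line.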

Hypothesis of language acquisition by cognitive linguistics

In cognitive linguistics, the closest analogy is the way machine learning picks out "faces" and "dogs" from images.

The first step is to build up an awareness of one's surroundings.
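As a minimal sketch of this idea, assume each "experience" is a small feature vector (the numbers are invented) and that categories emerge purely from similarity, without labels:

    def dist(p, q):
        # Euclidean distance between two feature vectors.
        return sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5

    def grow_categories(experiences, threshold=1.0):
        # Greedy clustering: each new experience joins the nearest
        # existing category, or founds a new one if nothing is close.
        categories = []
        for e in experiences:
            best = min(categories, key=lambda c: dist(e, c[0]), default=None)
            if best is not None and dist(e, best[0]) < threshold:
                best.append(e)
            else:
                categories.append([e])
        return categories

    # Two tight clusters of invented 2-D "percepts" emerge on their own:
    print(len(grow_categories([(0, 0), (0.2, 0.1), (5, 5), (5.1, 4.9)])))  # -> 2

Cognitive linguistics does not prescribe an algorithm; the threshold and distance here are stand-ins for whatever similarity judgment perception actually provides.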

Language Learning

What follows draws not only on the linguistics literature, such as generative grammar and cognitive linguistics, but also on philosophy, brain science, artificial intelligence (AI), applied behavior analysis (ABA), and so on, put together with my own experience.

Learning word usage

In philosophy, there are studies on "What is a human being?", "What is life?", and "What is 'what is'?"

Within that research there is also a hypothesis that the meaning of a word is determined by learning examples of its use. In other words, when you ponder "what is a human being?", you are recalling situations from your own experience in which the word "human being" was used, not grasping some underlying truth about human beings. This is why we can say that the meaning of a word varies from person to person and that there is no single correct definition.

From a huge number of examples, we learn, for instance, that words such as "fall" and "eat" often come after "apple." This contextual information becomes usable as the meaning of words.
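A minimal sketch of learning such contextual information, assuming the corpus is just a list of tokenized sentences (the toy sentences are invented and use object-verb order, as in Japanese):

    from collections import Counter, defaultdict

    def count_following_words(sentences):
        # For each word, count which words come right after it.
        following = defaultdict(Counter)
        for sentence in sentences:
            for word, nxt in zip(sentence, sentence[1:]):
                following[word][nxt] += 1
        return following

    corpus = [["apple", "falls"], ["apple", "falls"], ["apple", "eat"]]
    print(count_following_words(corpus)["apple"].most_common())
    # -> [('falls', 2), ('eat', 1)]

Scaled up to a real corpus, these counts are exactly the kind of contextual information described above.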

As an aside, recommendation systems and natural language processing, as used in artificial intelligence (AI), are technologies that rely on this hypothesis: they extract the information you want from examples of word use.

However, looking at how language acquisition is treated in applied behavior analysis (ABA), it is not only "learning from examples" but also learning through things that actually exist. It seems sensible to distinguish between the two according to whether a word is abstract or concrete. The figure below summarizes this idea.

Grouping and abstraction

If you only learn examples of how words are used, you cannot apply them to anything new.

Grouping and abstraction also play a part. If you can form a group on the grounds that "apples and oranges seem similar," you know that such a group exists. At some point you also learn that this group is called "fruit."

Once you can group them, you can take an example sentence learned with "apple," substitute "mandarin orange," and thereby express the desire to "eat a mandarin orange."
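A minimal sketch of this substitution, assuming the group and the learned example sentence are already available (both invented here):

    FRUIT = {"apple", "mandarin orange"}   # a group learned earlier

    def generalize(example, group):
        # Swap the group member in a learned example for each other member.
        return [[member if w in group else w for w in example]
                for member in group]

    learned = ["I", "want", "to", "eat", "apple"]
    for sentence in generalize(learned, FRUIT):
        print(" ".join(sentence))
    # -> both "I want to eat apple" and "I want to eat mandarin orange"
    #    (set iteration order may vary)

One learned sentence plus one group yields a sentence the learner has never actually heard.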

Learning grammar

In cognitive linguistics, grammar seems to be something learned naturally in the course of learning words. For example, grammar is acquired from observations such as "the word 'o' (the Japanese object particle) is usually preceded by a word that names a thing and followed by a word that expresses an action."
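A minimal sketch of learning such a pattern, assuming sentences are tokenized and we simply tally what appears on each side of a pivot word (the romanized toy sentences are invented):

    from collections import Counter

    def slot_pattern(sentences, pivot):
        # Tally the words observed just before and just after a pivot word.
        before, after = Counter(), Counter()
        for s in sentences:
            for i, w in enumerate(s):
                if w == pivot:
                    if i > 0:
                        before[s[i - 1]] += 1
                    if i + 1 < len(s):
                        after[s[i + 1]] += 1
        return before, after

    toy = [["ringo", "o", "taberu"], ["mizu", "o", "nomu"], ["hon", "o", "yomu"]]
    before, after = slot_pattern(toy, "o")
    print(before.most_common())  # names of things: ringo, mizu, hon
    print(after.most_common())   # action words: taberu, nomu, yomu

From enough such tallies, a learner can notice that the slot before "o" is filled by names of things and the slot after it by actions, which is the grammatical pattern itself.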

What is grammar

Grammar guides seem to me to be written on the premise that a perfectly systematic grammar exists.

But spoken language is often not grammatically correct in the first place. Transcribe a meeting recording verbatim and it is normal for it not to be grammatical; we convey what we want to say by adding word after word.

If you think of grammar as something you learn as you learn how to use words, then grammar is also a collection of examples.

For example, it seems natural to think of Japanese grammar as a summary of the examples that people in Japan frequently use.

Rethinking "Generative Grammar"

If there is something innate about language, as generative grammar claims, I suspect it is not an abstract grammar that can lead to the grammar of any language, but rather the mechanism in the human brain that learns and uses words.


