Linguistic Completeness of Information

I propose a method for determining the “linguistic completeness of information” in a given piece of text. Meanings of words are either dependent upon other entities and concepts, or not. For example, suppose I use the word ‘gave’. Now, ‘gave’ means the transfer of something from one place to another. This invokes the dependence or attachment of this…
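One way this dependence could be made concrete (a minimal sketch of my own; the slot names and the tiny lexicon are illustrative assumptions, not the post's actual method) is to list, for each verb, the external entities its meaning depends on, and call a statement "complete" when every such slot is filled:

```python
# Hypothetical lexicon: verb -> semantic slots its meaning depends on.
# 'gave' depends on a giver, a thing given, and a receiver; a verb like
# 'slept' has no external dependencies.
DEPENDENT_SLOTS = {
    "gave": {"giver", "thing_given", "receiver"},
    "slept": set(),
}

def missing_slots(verb, filled):
    """Return the slots of `verb` not covered by the `filled` mapping."""
    return DEPENDENT_SLOTS.get(verb, set()) - set(filled)

def is_complete(verb, filled):
    """True when the text supplies every entity the verb depends on."""
    return not missing_slots(verb, filled)

# "John gave Mary a book" fills all three slots of 'gave':
print(is_complete("gave", {"giver": "John", "thing_given": "book",
                           "receiver": "Mary"}))  # True
# "John gave a book" leaves the receiver unstated:
print(missing_slots("gave", {"giver": "John", "thing_given": "book"}))
```

A completeness check then reduces to reporting which dependency slots a sentence leaves unfilled.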

A way to represent words

This is a way to represent words in language on a computer. Take a number, say 100. I can do logical / computational operations on it, like taking its half, taking its factorial, its square root, etc. But take a word, say, ‘boy’. Can I do operations on it? Can I take (boy)! or…
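One common way this gets realized in practice (a toy illustration of my own, not necessarily the post's proposal) is to map each word to numbers, so that ordinary arithmetic becomes available on words via their numeric representations:

```python
# Hand-picked toy vectors standing in for words; the values here are
# illustrative assumptions only.
words = {
    "boy":  [1.0, 0.0],
    "girl": [1.0, 1.0],
    "man":  [2.0, 0.0],
}

def add(u, v):
    """Add two word vectors component-wise."""
    return [a + b for a, b in zip(u, v)]

def halve(u):
    """Take 'half' of a word by halving its vector."""
    return [a / 2 for a in u]

print(halve(words["man"]))  # [1.0, 0.0] -- numerically equal to "boy"
print(add(words["boy"], words["girl"]))
```

Once the mapping exists, the operations that were meaningless on the raw string ‘boy’ become well-defined on its numeric stand-in.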

A super-simple NLP idea

Consider these major parts of speech: N, V, Adj, Adv, Art, Prep, Conj. That gives 7P2, i.e. 42, such permutations, e.g. N V, V Adv, Prep Adv, etc. For each pair, write the general, elaborate meaning. E.g. (Proper N) + (V): “John called”. Such a meaning of this fragment (pair) is: ‘A person named N…
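The enumeration step can be sketched directly (the meaning templates below are my own illustrative guesses, not the post's actual entries):

```python
from itertools import permutations

# The seven parts of speech named in the post.
POS = ["N", "V", "Adj", "Adv", "Art", "Prep", "Conj"]

# All ordered pairs: 7P2 = 7 * 6 = 42 permutations.
pairs = list(permutations(POS, 2))
print(len(pairs))  # 42

# Hypothetical "general, elaborate meaning" templates for a few pairs.
TEMPLATES = {
    ("N", "V"): "An entity named by the noun performs the action of the verb.",
    ("Adv", "V"): "The verb's action is carried out in the manner of the adverb.",
}
print(TEMPLATES[("N", "V")])
```

Filling in all 42 templates would give a small lookup table mapping any adjacent part-of-speech pair to its general meaning.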

Challenging two of Chomsky’s claims

Chomsky says that language did not evolve for communication, but for internal thought. Let’s see how this can be challenged. To begin with, I can make my own language from the outset. Suppose we didn’t have the English language the way we have it now. Let’s construct an English. Here is how it goes: take any two things…

The Definition of Definition

A word is defined in a context in which it is always present. A thief (defined as one who steals) is defined in the context of the act of stealing, in which it is always present. (Similarly, the stolen thing CAN BE defined in the context of the act of stealing, in which it is…
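The claim can be rendered as a tiny data structure (the entries are my own assumptions for illustration): a definition names a context in which the defined thing is always present, plus the role it plays there.

```python
# Hypothetical definition entries: word -> (defining context, role played
# in that context). Both 'thief' and 'stolen thing' are defined within
# the same context -- the act of stealing -- in different roles.
DEFINITIONS = {
    "thief": {"context": "act of stealing", "role": "one who steals"},
    "stolen thing": {"context": "act of stealing", "role": "that which is stolen"},
}

def share_context(word_a, word_b):
    """True when two words are defined within the same context."""
    return DEFINITIONS[word_a]["context"] == DEFINITIONS[word_b]["context"]

print(share_context("thief", "stolen thing"))  # True
```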

The Heart of Commonsense NLP – a technique to capture the “more”

What lies at the heart of Commonsense NLP is that the meaning of a sentence contains MORE than what is actually stated in the sentence in terms of words. What is this more? Can we capture it? Here is an idea. I) Let’s look at this ‘MORE’ elementally. Make all the possible pairs of words…
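The pairing step can be sketched as follows (the commonsense annotations are my own illustrative placeholders, not the post's actual technique):

```python
from itertools import combinations

# Content words of a sentence, hand-picked here for illustration.
sentence = "John opened the bottle"
words = ["John", "opened", "bottle"]

# All unordered pairs of content words.
pairs = list(combinations(words, 2))
print(pairs)  # [('John', 'opened'), ('John', 'bottle'), ('opened', 'bottle')]

# Hypothetical unstated "more" carried by each pair.
MORE = {
    ("opened", "bottle"): "the bottle was closed before; it has a cap or lid",
    ("John", "bottle"): "John could reach and hold the bottle",
}
for p in pairs:
    print(p, "->", MORE.get(p, "(to be filled)"))
```

Each pair becomes a prompt for asking what unstated commonsense link the sentence presupposes between those two words.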

Linguistic Commonsense

Let me introduce 2 kinds of verbs – ‘Free’ and ‘Attached’. Attached verbs are those verbs which have a variable in their definition. They are dependent upon an external variable entity for the completion of their semantics. Free verbs are those which don’t have a variable in their definitions; they are “free” and self-sufficient. For…
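A minimal sketch of the Free / Attached distinction (the toy lexicon is my own assumption): an Attached verb carries a variable entity in its definition, a Free verb does not.

```python
# Hypothetical lexicon: verb -> variable entities in its definition.
# An empty list means the verb's semantics is complete in itself.
VERB_DEFS = {
    "sleep": [],                      # Free: self-sufficient
    "run":   [],                      # Free
    "give":  ["thing", "receiver"],   # Attached: needs what is given, and to whom
    "hit":   ["target"],              # Attached: needs a thing that is hit
}

def kind(verb):
    """Classify a verb as 'Free' or 'Attached' from its definition variables."""
    return "Attached" if VERB_DEFS[verb] else "Free"

print(kind("sleep"))  # Free
print(kind("give"))   # Attached
```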

Flipping concepts in Language like terms in equations

I am introducing a concept in Linguistics akin to flipping terms in mathematical equations. Consider this equation: P, in terms of T, is P = nRT/V, whereas T, in terms of P, would be T = PV/(nR). Or, given P = a^(1/2), a, in terms of P, would be a = P^2. So, given ‘John is in his room’, the…
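The linguistic analogue can be sketched as relation inversion (the inverse table is my own illustrative assumption): just as T = PV/(nR) re-expresses P = nRT/V from T's point of view, a sentence stating a relation can be re-expressed from the other term's point of view.

```python
# Hypothetical table of relations and their inverses.
INVERSE = {
    "is in": "contains",
    "contains": "is in",
    "owns": "is owned by",
}

def flip(subject, relation, obj):
    """Re-express `subject relation obj` in terms of `obj`."""
    return f"{obj} {INVERSE[relation]} {subject}"

print(flip("John", "is in", "his room"))  # "his room contains John"
```

The subject and object swap places, and the relation is replaced by its inverse, exactly as a term moved across an equals sign picks up the inverse operation.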

First commonsensical comments

What is being referred to here is a casual mode, in everyday life, of observing new things. Say you visit your friend’s locality and house and come across various things. What will come to your mind commonsensically, first, upon seeing each of those things? No computer can come up with such things (comments). PRINCIPLE: …

Zeroth law of NLP

The default “direction” of flow of data / knowledge in all written and spoken material is, mostly, FORWARD. Things are moving ahead. They obviously go from something temporally before to something temporally later. We read ahead in time. First comes something, and THEN comes something else, and more. We increase data that is coming at…