Created: September 4, 2015 / Updated: December 12, 2016 / Status: in progress / 4 min read (~734 words)
- How did language emerge?
- Structured vs unstructured languages
- Text summarization techniques
- Text understanding
- Text generation
- Determine if two texts are the same or similar
- Char-RNN (https://github.com/karpathy/char-rnn)
- How to properly classify each type of sentence?
Being able to understand natural language is an important step toward building something one might call intelligent. Language is first and foremost how individuals communicate with one another using a common syntax.
Some natural languages cannot express certain concepts at all, while others can describe the same concept in great detail. From this, a natural question arises: is there such a thing as a language that can express everything?
Reading a word is essentially triggering, letter by letter, a sub neural network multiple times until the appropriate word is activated. For instance, reading "word" would first trigger all words starting with the letter "w", then "wo", then "wor", and finally "word". As the same sub neural network gets activated repeatedly, its residual activation keeps increasing as more and more letters of the word are read.
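The prefix-activation idea above can be sketched as a toy simulation. This is a hypothetical illustration, not a real neural network: the vocabulary, the unit of activation per matching prefix, and the function name are all invented for the example.

```python
def prefix_activations(word, vocabulary):
    """Accumulate one unit of activation for each prefix of `word`
    that matches a vocabulary entry, mimicking the idea that reading
    letter by letter repeatedly re-activates candidate words."""
    activation = {w: 0 for w in vocabulary}
    for i in range(1, len(word) + 1):
        prefix = word[:i]
        for w in vocabulary:
            if w.startswith(prefix):
                activation[w] += 1
    return activation

vocab = ["word", "world", "work", "wide", "cat"]
act = prefix_activations("word", vocab)
# "word" matches all four prefixes ("w", "wo", "wor", "word"),
# so it ends with the highest residual activation.
print(act)  # → {'word': 4, 'world': 3, 'work': 3, 'wide': 1, 'cat': 0}
```

The word actually being read accumulates the most activation, since every one of its prefixes matches it, while competitors drop out as soon as a letter disambiguates them.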
- Hand/hard coding of large sets of rules (if/then)
- Based on statistical machine learning
- Generally grounded in statistical inference
- Analysis of large corpora of real-world examples
- Supervised learning using tagged corpus
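Two of the approaches listed above can be contrasted on the sentence-classification question raised earlier. The sketch below is purely illustrative: the rules, the tiny tagged corpus, and the first-word heuristic are all invented assumptions, not an established method.

```python
from collections import Counter, defaultdict

def rule_based(sentence):
    """Hand-coded if/then rules: classify by final punctuation."""
    s = sentence.strip()
    if s.endswith("?"):
        return "interrogative"
    if s.endswith("!"):
        return "exclamatory"
    return "declarative"

def train_counts(tagged_corpus):
    """Supervised learning from a tagged corpus: count how often
    each label's sentences begin with a given first word."""
    counts = defaultdict(Counter)
    for sentence, label in tagged_corpus:
        first = sentence.split()[0].lower()
        counts[label][first] += 1
    return counts

def statistical(sentence, counts):
    """Statistical inference: pick the label whose corpus most often
    starts with the same first word as the input sentence."""
    first = sentence.split()[0].lower()
    return max(counts, key=lambda label: counts[label][first])

corpus = [
    ("How does language emerge", "interrogative"),
    ("What is a language", "interrogative"),
    ("Language is a shared syntax", "declarative"),
    ("Words trigger activations", "declarative"),
]
model = train_counts(corpus)
print(rule_based("Is this a question?"))            # → interrogative
print(statistical("How are texts compared", model))  # → interrogative
```

The rule-based version needs no data but breaks on unpunctuated text; the statistical version degrades gracefully but only as well as its corpus allows, which is the trade-off the list above hints at.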