https://github.com/aparrish/gen-text-workshop
Example Bots
Camptown Races
Egress Methods
Markov Chains
A classic technique in text generation, in use since the 1960s.
Ngrams
The basis is the n-gram: a sequence of n adjacent units (here, characters). For example, the 2-grams of "smarting" are ["sm", "ma", "ar", "rt", "ti", "in", "ng"].
https://books.google.com/ngrams
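As a quick illustrative sketch (the ngrams helper below is my own, not part of the workshop code), character n-grams can be extracted like this:

    def ngrams(text, n):
        """Return the list of overlapping character n-grams in text."""
        return [text[i:i+n] for i in range(len(text) - n + 1)]

    print(ngrams("smarting", 2))
    # ['sm', 'ma', 'ar', 'rt', 'ti', 'in', 'ng']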
Markov Modeler (Python)
    import markov

    text = open('../text.txt').read()          # source text to model
    model = markov.model(text, 3)              # build an order-3 model
    print(''.join(markov.generate(model, 3)))  # generate and print new text
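The markov import above is a helper module from the workshop repo, not the standard library. As a rough sketch of what an order-n character Markov model like this does internally (the function names and signatures below are my own illustration, not the module's API):

    import random
    from collections import defaultdict

    def build_model(text, n):
        """Map each character n-gram to the characters that follow it in text."""
        model = defaultdict(list)
        for i in range(len(text) - n):
            gram = text[i:i+n]
            model[gram].append(text[i+n])
        return model

    def generate(model, n, length=200):
        """Start from a random n-gram and repeatedly sample a next character."""
        current = random.choice(list(model.keys()))
        output = current
        for _ in range(length):
            nexts = model.get(current)
            if not nexts:            # dead end: this n-gram only appeared at the end
                break
            output += random.choice(nexts)
            current = output[-n:]
        return output

    # same source file as the example above
    text = open('../text.txt').read()
    print(generate(build_model(text, 3), 3))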
Context-Free Grammar
Define a grammar that describes the structure of the text you want to emulate. Think of diagramming a sentence: sentences have hierarchy and recursion.
That dog laughed.
That dog over there laughed.
That dog I told you about yesterday—the insolent one—laughed.
***Simple grammar***
Sentence => Noun Phrase + Verb Phrase
Noun Phrase => "the" + Noun
Verb Phrase => Verb
Noun => "dog"
Verb => "barked"
***Slightly Less Simple Grammar***
Sentence => [Noun Phrase + Verb Phrase]
Noun Phrase => ["the" + Noun, "the" + Adjective + Noun]
Verb Phrase => [Verb, Verb + Noun Phrase]
Noun => ["dog", "cat"]
Verb => ["barked"]
Adjective => ["brown"]
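To see how rules like these turn into sentences, here is a minimal sketch of a recursive expander for the grammar above (the dictionary encoding and the expand function are my own illustration, not code from the workshop):

    import random

    # Each rule maps a symbol to a list of possible expansions; each
    # expansion is a list of symbols and/or literal words.
    grammar = {
        "Sentence":    [["Noun Phrase", "Verb Phrase"]],
        "Noun Phrase": [["the", "Noun"], ["the", "Adjective", "Noun"]],
        "Verb Phrase": [["Verb"], ["Verb", "Noun Phrase"]],
        "Noun":        [["dog"], ["cat"]],
        "Verb":        [["barked"]],
        "Adjective":   [["brown"]],
    }

    def expand(symbol):
        """Recursively expand a symbol; literal words expand to themselves."""
        if symbol not in grammar:
            return symbol
        expansion = random.choice(grammar[symbol])
        return " ".join(expand(part) for part in expansion)

    print(expand("Sentence"))  # e.g. "the brown cat barked the dog"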