I split each chapter of Moby-Dick into sentences, then used a neural network to try to guess what order the sentences should appear in. I call the result Mboy-Dcki.
This is essentially a Markov chain model that works at the level of sentences rather than words or tokens. Such a model cannot be trained directly, so I created an encoder-decoder-style recurrent neural network that takes in the last 25 characters of a sentence and tries to guess the first 25 characters of the next sentence. I then used this network to compute, for each ordered pair of sentences, the probability that the second follows the first.
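Once those pairwise probabilities exist, turning them into an ordering is the easy part. The sketch below shows one simple way to do it, assuming the network has already produced a score matrix; the function name, the score values, and the greedy follow-the-best-next strategy are all illustrative, not necessarily what the project actually uses.

```python
def reorder(sentences, scores, start=0):
    """Greedily chain sentences: from the current sentence, jump to the
    unused sentence the model thinks is most likely to come next.

    `scores[i][j]` is the model's probability that sentence j follows
    sentence i (a hypothetical matrix; the real model's scores come
    from the character-level encoder-decoder network).
    """
    order = [start]
    remaining = set(range(len(sentences))) - {start}
    while remaining:
        current = order[-1]
        # Pick the highest-scoring successor among unused sentences.
        nxt = max(remaining, key=lambda j: scores[current][j])
        order.append(nxt)
        remaining.remove(nxt)
    return [sentences[i] for i in order]

# Toy example with hand-made scores for three sentences.
sentences = ["Call me Ishmael.", "Some years ago...", "It is a way I have."]
scores = [
    [0.0, 0.9, 0.1],   # after sentence 0, sentence 1 is most likely
    [0.0, 0.0, 1.0],   # after sentence 1, sentence 2
    [0.5, 0.5, 0.0],
]
print(reorder(sentences, scores))
```

A greedy walk like this can paint itself into a corner (the last few sentences may be forced into unlikely positions), which is one plausible source of the interesting failures.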
It actually sort of works—at the very least, it picks the right next sentence a little more often than chance alone would. But the point, of course, is in the interesting ways it fails.