TensorFlow LSTM
Data: The model was originally trained on the Penn Treebank (PTB) dataset (https://corochann.com/penn-tree-bank-ptb-dataset-introduction-1456.html).
Output: Once trained, the model generates text one character at a time rather than word by word. After training on the PTB dataset, I trained the model on the works of Shakespeare to produce original plays.
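The character-by-character generation loop can be sketched as below. This is a minimal illustration, not the project's actual code: `next_char_probs` is a hypothetical stand-in for the trained LSTM's output distribution (here just random softmax scores), while the vocabulary mapping and sampling loop mirror how character-level generation typically works.

```python
import numpy as np

# Build a character vocabulary from a tiny corpus (stand-in for PTB/Shakespeare).
text = "to be or not to be"
chars = sorted(set(text))
char_to_idx = {c: i for i, c in enumerate(chars)}
idx_to_char = {i: c for c, i in char_to_idx.items()}

rng = np.random.default_rng(0)

def next_char_probs(context):
    """Hypothetical stand-in for the trained LSTM: in the real model this
    would be a forward pass over the context; here it is random logits."""
    logits = rng.normal(size=len(chars))
    return np.exp(logits) / np.exp(logits).sum()  # softmax

def generate(seed, n=20):
    """Extend the seed one sampled character at a time."""
    out = seed
    for _ in range(n):
        probs = next_char_probs(out)
        idx = rng.choice(len(chars), p=probs)  # sample the next character
        out += idx_to_char[idx]
    return out

sample = generate("to ", n=20)
```

Because the model emits single characters, it has to learn word boundaries (spaces) on its own, which is why the sample output below contains word-shaped but meaningless tokens.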
Sample Output (Accuracy = 97.265%):
d on the capital of indianalonss hecturtor o ia to rontsit r us>r od tisitssnosnma oraonaosonheinge o so ro t ro ce tocn to nons foltsr irannhe tan fa nes oote tor fissototsone sosss te tosno nenod ge nstto isunr's 's nan focoupits f onofa nore te sung if tte onns hand to s nhe taninwn n's tfws the <one twns 'ocofl a fe towwnessotpcons mots the ceng anet onet the rimy nhe turion snor of monn iatnss <nns fon foarsnlsena oonito onssofasnor cnsr rantis the delk de to te guuns soronts n'n s nn>nonil usggof mhr fol of s io n oo issun ts n ne tunk> itssotis ofn lonr likts oo t on r me nanononit es nurof saota iis 's mondfoa nos sonstdenn sof stts not inso mastor nur fosto m sk to mnng annftpinnsste tuning twn shansst to tor o rinn owssofor nge nhnn nanitsot is nke <onr sunestth sstsof wwnsfoonsogins aorn m onl oyansike dens t oo nh o nho <efis rond ti os to nu kh onnsh tustnsta tane nosn'sctur mha ronl oorite nossofa tua orandesif neot s i> tden cacion eonus 'hannoriasiteitonece nororott s nn tont ioinonofennhe not oons o ot 'ssod ot ng o nke thes nul oreo nasonnssiilsonis oor none cnns fa o soon aoor fos 'aonotsofuck rurgd deen ne tunkded nis no toppatisito sonis onreaimor o teit gn rutkr orpnr t ng nutiteotonstge tsin hor tung ofitos i se no ownr gond twned tend nenestt sonnstiisnhe ouna to i oo ooss te for o ininon nh s la te ta gocs natiano turoff ioce nhre nate benn eennriofinsothins menndnnrswnhe o rigedrr nssfop ith f ri tond nondinhos mhat 'to tosigon uosafot corc sons to te non hong mantra nhs or sottstofwnsto noso nasifu nesporis onna ones ohe deos 'i beons torstfeiforna cinn <une sollr mecnresonositgi forisn orwns totpntlid be oonsfo t ou ihof mnt ton the delm tritwnnstn'ssthisofhan 's gor the nol o oy o ite nos swne twnsos nasiraitssn isifus ooa tif oonoceondeo ke tott teet so r be thanufgestn he ne t son na < cu tornati t nu nunefaanufo s sgon to nr ihe ina onnnofonss guft sornnd tosts soss le turins nos the funn tertnws ih norirrofrwnsnn s or the rkke teed 
niseonacounis onns <its con sianit inis fn sanutt o nes ofcofopyio oos gocsfta onss fean thet dhit 'uin tawnsn oonmnoonns gons to i tion ns n ts nuntf rino onnas oo son sooc l orepth annh uunk tatruthor nk> nons ton tisgofs tonn t's th ofa aoty osr'sosonoota tos oaronke onrce that he n nk tecnus wure trat th snns toceuch t ir nreto io ihin fo none torennss <eat it s natgon fooci gool to if honnfsonunwony tendoli ihins gonnf oo r ot < on ta nots fons aatn t ring <gnr t esonute n re n na won ios niso t sof ns <onk
Notes: The model did learn to partition characters into words, although the words it produced often had no meaning. The model was relatively small, though, and was trained for only 1.6 epochs. Accuracy could likely be improved by predicting words rather than characters.
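The trade-off suggested above can be made concrete with a small tokenization comparison. This sketch is illustrative only (the line of text is an arbitrary example, not from the training data): a word-level model makes far fewer predictions per line and each prediction is a valid word by construction, at the cost of a much larger vocabulary.

```python
line = "friends romans countrymen lend me your ears"

# Character-level tokenization: many short steps, tiny vocabulary.
char_tokens = list(line)
char_vocab = sorted(set(char_tokens))

# Word-level tokenization: few steps, every output is a real word,
# but the vocabulary grows with the corpus.
word_tokens = line.split()
word_vocab = sorted(set(word_tokens))
```

For a corpus the size of PTB or the complete Shakespeare, the word vocabulary runs to tens of thousands of entries versus a few dozen characters, which is the main cost of the word-level approach.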