A couple of weeks ago, I went to my first hackathon with my friend. (A hackathon is a competition where you have to try to code a finished product within a strict time period, so, like, if coding for 24 hours straight is your idea of fun….)
Anyway, we kind of failed (we never finished our project), but we DID go to a bunch of the workshops they offered there. One of them was on machine learning, and after attending it, I decided to try to learn more on my own.
For this, I used Google Colaboratory, an environment that lets you run machine learning and data analysis code in the cloud, and TensorFlow, an open-source Python library for machine learning.
TensorFlow’s website includes a bunch of tutorial code you can use, so I started from their tutorial code for an RNN text generator.
What is an RNN?
RNN stands for “recurrent neural network”, a type of artificial intelligence that looks at patterns in data in order to make predictions. In the case of text generation, the RNN reads a bunch of example text, learns the common patterns in it, and then generates its own text by repeatedly choosing the most likely next letter based on the patterns it’s seen before.
Teaching the RNN to Generate Book Reviews
The first thing I did was copy and paste every book review I’ve ever posted on my blog into one mass plaintext file. Then I uploaded that file into the colab notebook containing the code from the Tensorflow tutorial, and replaced the line that loaded in Tensorflow’s example data with my text file. That way the RNN would be reading my book reviews instead of the Shakespeare play data the tutorial used.
(see this YouTube tutorial I used to learn how to do this)
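In case it helps, here’s a minimal sketch of that swap. The file name `reviews.txt` and the variable names are just my choices, and I create a tiny stand-in file so the snippet runs anywhere; in the actual notebook you’d upload your real file through Colab’s Files sidebar and skip that step.

```python
# Instead of downloading the tutorial's Shakespeare file, point the notebook
# at the uploaded reviews file.
path_to_file = "reviews.txt"

# (For this sketch only: create a tiny stand-in file so the code runs anywhere.)
with open(path_to_file, "w", encoding="utf-8") as f:
    f.write("This book was equal parts sad and horrifying. Five stars!")

# This mirrors what the tutorial does next: read the text and collect every
# unique character, which becomes the set of characters the RNN can emit.
text = open(path_to_file, "rb").read().decode(encoding="utf-8")
vocab = sorted(set(text))

print(f"Length of text: {len(text)} characters")
print(f"{len(vocab)} unique characters")
```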
Then it was time to train the AI!
The RNN “learns” by going through the data set and observing patterns. The number of “epochs” used to train the RNN refers to the number of times the RNN has looked through the entire data set.
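Schematically (this is a toy sketch to show what “epochs” means, not the tutorial’s actual training code), one epoch is one full pass over the batched data:

```python
# Toy illustration of epochs: the outer loop is the epoch count, and each
# epoch revisits every batch of the dataset exactly once.
dataset = [["batch-1"], ["batch-2"], ["batch-3"]]  # stand-in for real batches
EPOCHS = 10

steps_seen = 0
for epoch in range(EPOCHS):
    for batch in dataset:
        # In the real tutorial, this is one gradient-descent step on the batch.
        steps_seen += 1

print(steps_seen)  # each of the 3 batches was seen 10 times
```

In the tutorial, this whole double loop is handled by a single Keras call along the lines of `model.fit(dataset, epochs=EPOCHS)`.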
First, I tested out using 10 epochs, and this is what it generated:
After 30 epochs of training, it’s starting to get somewhere. There are some coherent words and some phrases I guess it’s seen appear often in my reviews. I mean, “cringey, physics or death five stars”, that’s totally representative of me… and I’d totally read a “romance device controversial mystery”.
In an attempt to get it to be more creative, I upped the “temperature” variable (which controls how random the predictions are, i.e. how closely the model sticks to the data it knows) from 1.0 to 2.0:
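Concretely (a small sketch with made-up scores, not the tutorial’s code), temperature divides the model’s raw scores before they’re turned into probabilities, so a higher temperature flattens the distribution and makes unlikely letters more likely to get picked:

```python
import math

def softmax_with_temperature(logits, temperature):
    # Divide the raw scores by the temperature, then normalize to probabilities.
    scaled = [x / temperature for x in logits]
    exps = [math.exp(x) for x in scaled]
    total = sum(exps)
    return [e / total for e in exps]

# Made-up scores for three candidate next letters.
logits = [2.0, 1.0, 0.1]

cool = softmax_with_temperature(logits, 1.0)
hot = softmax_with_temperature(logits, 2.0)

# At temperature 2.0 the top choice is less dominant than at 1.0, so
# sampling picks the "weird" letters more often.
print(max(cool), max(hot))
```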
100 epochs, temperature 1.0. At first, the AI simply spit out some sentences ripped from my reviews, and then got into a bit of a rut where it just began reciting my Monday Mini-Reviews post on Seven Brief Lessons on Physics and The Interstellar Age verbatim.
But then, it started getting a bit more interesting…
Increasing the number of RNN units
The next day, I returned to my laptop as usual and decided to keep trying to get this thing to produce more interesting results. I changed the number of RNN units from 1024 to 2048 to see what it would do. The results became a bit more intriguing, but for the most part remained similar.
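For a rough sense of why that change matters: the tutorial’s model uses a GRU layer, and a GRU’s weight count grows roughly with the square of the unit count, so going from 1024 to 2048 units more than triples the layer’s weights. Here’s a back-of-the-envelope sketch using the textbook GRU formula (ignoring bias-layout details that vary between implementations; the 256 embedding size is the tutorial’s default):

```python
def gru_param_count(input_dim, units):
    # A GRU has 3 gates, each with an input weight matrix (units x input_dim),
    # a recurrent weight matrix (units x units), and a bias vector (units).
    return 3 * (units * input_dim + units * units + units)

embedding_dim = 256  # size of the character embeddings fed into the GRU

small = gru_param_count(embedding_dim, 1024)
big = gru_param_count(embedding_dim, 2048)

print(small, big, round(big / small, 2))  # the bigger layer has ~3.6x the weights
```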
I’m seeing slightly fewer stolen sentences, but there are still some.
I continued to experiment some more with the temperature and training duration, until I ran out of my free GPU access for the apparent foreseeable future. I was able to get it to generate some very INTERESTING book reviews.
Anyway, here are some of the best sentences I was able to generate:
As Malala writes, “We realize the importance of the ‘lic'”
A prominent aspect of the story isn’t going anywhere good.
And darg that’s high stakes right there.
Because he takes such pains to marry a prince.
I also liked the writing that made you want to keep reading.
This book was executed wond… I don’t know. Hm.
fill of just thought. This said this. Then she said this.
Yes, this entire book is basically a lot of boing to Wed.
and guess what happens next…. they fall in love with the characters. It’s equal parts sad and horrifying, but definitely not 5-stars.
Katniss and Peeta have returned home on the book, but it’s possible the whole writing style exacerbated this.
How are you all? I’m just a bit of a little bit of myself, honestly.
Overall, Quiet by Suading is a thing and I had terror
the book is about courage and humanity working together to save us from extroversion
I don’t think I gave the model enough text to go off of, nor did I really know what I was doing when I was trying to get it to generate text, but all in all it was very fun to play around with. Will there be a sequel post to this? Perhaps. We’ll see.
Next time, I’ll try using more data and doing more research into how to improve the results. Overall, I’d say this experiment was equal parts sad and horrifying, but definitely not 5-stars.
What was your favorite sentence the AI generated? Would you like it if I started doing more CS-oriented posts like this one? Let me know in the comments!
If you liked this post, consider subscribing to Frappes & Fiction. I post about the books I read, the books I think YOU should read, and anything else on my mind.
I’m also on social media!
Watch my Instagram reel on this (#voicereveal)