I Taught An AI To Write My Book Reviews | RNN Text Generation with Tensorflow

A couple weeks ago, I went to my first hackathon with my friend. (A hackathon is a competition where you have to try to code a finished product within a strict time period, so, like, if coding for 24 hours straight is your idea of fun….)

Anyway, we kind of failed– never finished our project, but we DID go to a bunch of the workshops they offered there. One of them was on machine learning, and after attending it, I decided to try to learn more on my own.

For this, I was using Google Colaboratory, an environment that lets you run machine learning and data analysis code in the cloud, along with Tensorflow, an open-source Python library for machine learning.

Tensorflow’s website includes a bunch of tutorial code you can use, so I started from their tutorial for an RNN text generator.

What is an RNN?

RNN stands for “recurrent neural network”, a type of neural network that looks for patterns in sequential data in order to make predictions. In the case of text generation, the RNN reads a bunch of example text, learns the common patterns in it, and then generates its own text by repeatedly picking the most likely next character based on the patterns it has seen before.
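
(If you’re curious what that looks like in code, here’s a rough sketch of the kind of character-level model the Tensorflow tutorial builds. The layer sizes are just the tutorial’s defaults, and the vocabulary size below is made up.)

```python
import tensorflow as tf

# A minimal sketch of a character-level RNN like the one in the Tensorflow
# tutorial: an embedding turns each character ID into a vector, a GRU layer
# (the "recurrent" part) carries context forward from earlier characters, and
# a final dense layer scores every character as a candidate for "what comes next".
def build_model(vocab_size, embedding_dim=256, rnn_units=1024):
    return tf.keras.Sequential([
        tf.keras.layers.Embedding(vocab_size, embedding_dim),
        tf.keras.layers.GRU(rnn_units, return_sequences=True),
        tf.keras.layers.Dense(vocab_size),  # logits for the next character
    ])

# vocab_size = number of unique characters in the training text (made-up here)
model = build_model(vocab_size=65)
```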

Teaching the RNN to Generate Book Reviews

The first thing I did was copy and paste every book review I’ve ever posted on my blog into one mass plaintext file. Then I uploaded that file into the Colab notebook containing the code from the Tensorflow tutorial, and replaced the line that loaded Tensorflow’s example data with a line that loaded my text file. That way the RNN would be reading my book reviews instead of the Shakespeare play data the tutorial used.

(see this YouTube tutorial I used to learn how to do this)
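
The swap itself is basically one line: instead of downloading the tutorial’s Shakespeare file, the notebook reads the uploaded plaintext file. Something roughly like this, where the file name is just whatever you called the file you uploaded to Colab:

```python
import tensorflow as tf

# Tutorial version: download the Shakespeare sample text
# path_to_file = tf.keras.utils.get_file(
#     'shakespeare.txt',
#     'https://storage.googleapis.com/download.tensorflow.org/data/shakespeare.txt')

# Swapped-in version: read the uploaded plaintext file of book reviews instead
# (the file name here is hypothetical)
path_to_file = 'book_reviews.txt'
text = open(path_to_file, 'rb').read().decode(encoding='utf-8')

print(f'Length of text: {len(text)} characters')
print(f'{len(set(text))} unique characters')
```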

Then it was time to train the AI!

The RNN “learns” by going through the data set and observing patterns. The number of “epochs” used to train the RNN refers to the number of times the RNN has looked through the entire data set.
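
In Keras terms, training boils down to one call: model.fit with an epochs argument. Roughly, and assuming the model and the batched character dataset from the earlier tutorial steps already exist:

```python
import tensorflow as tf

# Rough sketch of the training step, assuming `model` and the batched
# character `dataset` were already built as in the Tensorflow tutorial.
# One epoch = one full pass over every (input, target) batch in the dataset.
loss = tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True)
model.compile(optimizer='adam', loss=loss)

EPOCHS = 10  # later bumped to 30, 50, and 100
history = model.fit(dataset, epochs=EPOCHS)
```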

10 Epochs

First, I tested out using 10 epochs, and this is what it generated:

As you can see, it’s mostly just gibberish. Evidently the RNN needs some more practice. I’m seeing “book” and “review” appearing a bit, though. Maybe it’s picked up on something.

30 Epochs:

After 30 epochs of training, it’s starting to get somewhere. There are some coherent words and some phrases I guess it’s seen often in my reviews. I mean, “cringey, physics or death five stars”, that’s totally representative of me… and I’d totally read a “romance device controversial mystery”.

50 Epochs:

At 50 epochs, the RNN was beginning to do something interesting. Basically, it started spitting out strange amalgamations of different sentences I’ve written in different reviews, spliced together. As you can see, it picked up on the three Hunger Games reviews that were in the dataset. I’m not sure where it got the first sentence, though.

In an attempt to get it to be more creative, I upped the “temperature” variable– which controls how random the predictions are and how closely it sticks to the data it knows– from 1.0 to 2.0:

Um wow okay
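
(For the curious: the temperature knob works by dividing the model’s scores for each possible next character before sampling from them. A rough sketch with made-up numbers:)

```python
import tensorflow as tf

# Made-up logits (scores) for three possible next characters
logits = tf.constant([[2.0, 1.0, 0.1]])

for temperature in (1.0, 2.0):
    scaled = logits / temperature  # higher temperature flattens the scores
    probs = tf.nn.softmax(scaled)
    next_id = tf.random.categorical(scaled, num_samples=1)  # sample the next character
    print(temperature, probs.numpy().round(3), next_id.numpy())

# At temperature 2.0 the probabilities are closer together, so "unlikely"
# characters get picked far more often. Hence the chaos above.
```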

100 Epochs

100 epochs, temperature 1.0. At first, the AI simply spit out some sentences ripped from my reviews, and then got into a bit of a rut where it just began reciting my Monday Mini-Reviews post on Seven Brief Lessons on Physics and The Interstellar Age verbatim.

But then, it started getting a bit more interesting…

It began splicing together my reviews in some pretty entertaining ways. (The combination of sentences ripped from my Project Hail Mary review and my Quiet review led to the assertion that “the book” is about courage and humanity working to save us from ~extroversion~)

Increasing the number of RNN units

The next day, I returned to my laptop as usual and decided to keep trying to get this thing to produce more interesting results. I decided to change the number of RNN units from 1024 to 2048 to see what it would do. The results became a bit more intriguing, but for the most part remained similar.
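
In the tutorial code that’s a single hyperparameter, so the change looked roughly like this (reusing the build_model sketch from earlier):

```python
# Doubling the GRU's hidden-state size gives the network more room to carry
# context, at the cost of slower training and more GPU memory.
rnn_units = 2048  # the tutorial's default is 1024

# vocab_size is the number of unique characters in the training text, as
# computed earlier; build_model is the sketch from the "What is an RNN?" section.
model = build_model(vocab_size=vocab_size, rnn_units=rnn_units)
```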

30 Epochs:

I’m seeing fewer stolen sentences, but there are still some.

I continued to experiment some more with the temperature and training duration, until I ran out of my free GPU access for, apparently, the foreseeable future. I was able to get it to generate some very INTERESTING book reviews.

Anyway, here are some of the best sentences I was able to generate:

As Malala writes, “We realize the importance of the ‘lic'”

A prominent aspect of the story isn’t going anywhere good.

And darg that’s high stakes right there.

Because he takes such pains to marry a prince.

I also liked the writing that made you want to keep reading.

This book was executed wond… I don’t know. Hm.

fill of just thought. This said this. Then she said this.

Yes, this entire book is basically a lot of boing to Wed.

and guess what happens next…. they fall in love with the characters. It’s equal parts sad and horrifying, but definitely not 5-stars.

Katniss and Peeta have returned home on the book, but it’s possible the whole writing style exacerbated this.

How are you all? I’m just a bit of a little bit of myself, honestly.

Overall, Quiet by Suading is a thing and I had terror

the book is about courage and humanity working together to save us from extroversion

Conclusion:

I don’t think I gave the model enough text to go off of, nor did I really know what I was doing when I was trying to get it to generate text, but all in all it was very fun to play around with. Will there be a sequel post to this? Perhaps. We’ll see.

Next time, I’ll try using more data and doing more research into how to improve the results. Overall, I’d say this experiment was equal parts sad and horrifying, but definitely not 5-stars.

What was your favorite sentence the AI generated? Would you like it if I started doing more CS-oriented posts like this one? Let me know in the comments!

If you liked this post, consider subscribing to Frappes & Fiction. I post about the books I read, the books I think YOU should read, and anything else on my mind.

I’m also on social media!

26 comments on “I Taught An AI To Write My Book Reviews | RNN Text Generation with Tensorflow”

  1. I really have no idea what most of this means (I’m severely tech challenged), but it was so much fun to read! It would be neat to program an AI to write book reviews one day, though I imagine I’d still prefer to write my own. Still, a fun concept and maybe one day you will be able to get an AI to write a real review!

    1. Haha, thank you for reading! It would certainly be interesting if in the future this technology improved enough to actually produce well-written results. I read somewhere that there already exist machine learning tools that can assist in writing simple newspaper articles and things like that!

  2. 🤣🤣🤣🤣🤣 This is one of my favorite posts you’ve ever done! I mean, agender Katniss and Peeta in an exciting WWII book featuring cabbage and pelts? I’m basically sold! 😂 I was snorting throughout this entire post, and the quotes at the end nearly killed me. There’s no way I can pick a favorite; they’re all literary masterpieces! 🤩

  3. Though I had to read this entire post a couple of times to fully understand this (sorry, blame my severe lack of brain cells and also I’m very, umm, un-tech savvy? If that makes sense?) BUT THIS IS SO COOL, I loved it! Teaching an AI to write a book review, honestly if this isn’t one of the best and most unique posts I’ve ever seen!

  4. What an interesting thought to try and get a computer to write reviews. I have some tech knowledge and I think it’s entirely possible to have a computer write a review, but I don’t have enough knowledge to put it in practice. I love that you tried and got some results! My favorite line was “A prominent aspect of the story isn’t going anywhere good.”

  5. this was so much fun to read about! It actually taught me a lot about how random text generators work, but was quite amusing to see how it spliced together aspects of your reviews! I really hope you keep working on it, because I’d love for it to give a fully coherent but completely random review!
