Elon Musk AI Text Generator with an LSTM in TensorFlow 2

Introduction

Elon Musk has become an internet sensation in recent years, with his views on the future, his fun personality, and his passion for technology. By now everyone knows him, either as the electric car guy or as the guy who builds flamethrowers. He is most active on Twitter, where he shares everything, even memes!

He inspires many young people in the IT industry, and I wanted to do a fun little project: create an AI that generates text based on his previous Twitter posts. I wanted to capture his style and see what kind of weird results I would get.

Preparation

The data I'm using was taken directly from Elon Musk's Twitter, both from his posts and from his replies. You can download the dataset at this link.

Importing the libraries:
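The original import cell did not survive extraction; a plausible set of imports for this tutorial (TensorFlow 2 with the Keras tokenizer and padding utilities, NumPy, pandas for the tweet CSV, and `re` for the cleaning step) might look like this:

```python
import re

import numpy as np
import pandas as pd
import tensorflow as tf
from tensorflow.keras.preprocessing.text import Tokenizer
from tensorflow.keras.preprocessing.sequence import pad_sequences
```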

Now I am going to create the function that removes all the links, hashtags, mentions, and anything else that would confuse the model, so that we are left with clean text.
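The cleaning function itself is missing from the extracted post; a minimal sketch of what it could look like (the name `clean_tweet` and the exact regexes are my assumptions) is:

```python
import re

def clean_tweet(text):
    """Strip the parts of a tweet that would confuse the model."""
    text = re.sub(r"http\S+", "", text)          # links
    text = re.sub(r"#\S+", "", text)             # hashtags
    text = re.sub(r"@\S+", "", text)             # mentions
    text = re.sub(r"[^A-Za-z0-9' ]", " ", text)  # punctuation, emoji, etc.
    return re.sub(r"\s+", " ", text).strip().lower()
```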

Let's define a tokenizer and apply it to the text. This is how we map each word to a numerical representation; we do that because neural networks cannot accept strings. If you are new to this, there is a great YouTube series by Lawrence Moroney that I suggest you check out:
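The tokenizer code is missing here; a sketch of the usual Keras approach (the toy `corpus` and the n-gram construction are assumptions based on the surrounding text) could be:

```python
from tensorflow.keras.preprocessing.text import Tokenizer

corpus = ["going to the moon", "the cats are going to mars"]  # toy stand-in for the cleaned tweets

tokenizer = Tokenizer()
tokenizer.fit_on_texts(corpus)
total_words = len(tokenizer.word_index) + 1  # +1 because index 0 is reserved for padding

# Build n-gram sequences: for each tweet, every prefix of length >= 2
input_sequences = []
for line in corpus:
    token_list = tokenizer.texts_to_sequences([line])[0]
    for i in range(2, len(token_list) + 1):
        input_sequences.append(token_list[:i])
```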

Now we have to define max_length (all data must be padded to a fixed length, just as with convolutions), and we also need to convert input_sequences to a NumPy array.

We are going to create data sequences, where we will use all the elements except the last one as our X, and the last element as the y of our data. Note that our y is a one-hot representation over total_words, which can sometimes be a lot of data (if total_words is 5952, that means each y has shape (5952,)).
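The X/y split described above can be sketched like this (the toy `sequences` array and the example `total_words` value are assumptions):

```python
import numpy as np
import tensorflow as tf

total_words = 5952                            # vocabulary size from the tokenizer
sequences = np.array([[0, 3, 1], [3, 1, 2]])  # toy padded sequences

X = sequences[:, :-1]       # everything except the last element
labels = sequences[:, -1]   # the last element of each sequence
y = tf.keras.utils.to_categorical(labels, num_classes=total_words)
# each row of y is a one-hot vector of shape (total_words,)
```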

Model

Below is the configuration of our model.
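The model configuration did not survive extraction. A typical architecture for this kind of next-word predictor, embedding into a bidirectional LSTM with a softmax over the whole vocabulary, might look like the sketch below; the layer sizes are my assumptions, not the article's exact numbers:

```python
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Embedding, Bidirectional, LSTM, Dense

total_words = 5952  # vocabulary size from the tokenizer (example value)

model = Sequential([
    Embedding(total_words, 100),        # map word indices to dense vectors
    Bidirectional(LSTM(150)),           # read the sequence in both directions
    Dense(total_words, activation="softmax"),  # probability for every word
])
```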

I tried a couple of optimizers and found that Adam works best for this example. Let's build and train the model:
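The compile-and-fit cell is missing; a self-contained sketch of that step (tiny random data and a shrunken model so it runs in seconds; the real article trains on the full tweet corpus for far longer) could be:

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Embedding, Bidirectional, LSTM, Dense

total_words = 50  # tiny vocabulary so this sketch runs quickly
X = np.random.randint(1, total_words, size=(20, 10))
y = tf.keras.utils.to_categorical(
    np.random.randint(1, total_words, size=(20,)), num_classes=total_words
)

model = Sequential([
    Embedding(total_words, 16),
    Bidirectional(LSTM(32)),
    Dense(total_words, activation="softmax"),
])
model.compile(
    loss="categorical_crossentropy",
    optimizer="adam",  # Adam worked best among the optimizers tried
    metrics=["accuracy"],
)
history = model.fit(X, y, epochs=2, verbose=0)  # the real run uses many more epochs
```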

Let's create a for loop that generates new text based on seed_text and the number of words we define. This part of the code can seem a bit intimidating, but once you read each line carefully, you will see that we have already done something similar before.
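The generation loop itself is missing from the post; a sketch of the usual pattern (the function name `generate_text` and its parameters are my assumptions) is below. Each iteration tokenizes the running text, pads it the same way as the training data, predicts the most likely next word, and appends it:

```python
import numpy as np
from tensorflow.keras.preprocessing.sequence import pad_sequences

def generate_text(seed_text, next_words, model, tokenizer, max_length):
    """Repeatedly predict the next word and append it to the seed."""
    for _ in range(next_words):
        token_list = tokenizer.texts_to_sequences([seed_text])[0]
        token_list = pad_sequences(
            [token_list], maxlen=max_length - 1, padding="pre"
        )
        predicted = int(np.argmax(model.predict(token_list, verbose=0), axis=-1)[0])
        for word, index in tokenizer.word_index.items():
            if index == predicted:       # map the index back to its word
                seed_text += " " + word
                break
    return seed_text
```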

Now is the time to play with our model. Wow!


Summary

Space is a great combination of cats!? Who would have known! As you can see, the results the model gives are silly and do not make much sense. As with all deep learning models, there are many things that could be tweaked to generate better results. I leave that to you.
