Deep N-Grams
This is an exploration of Recurrent Neural Networks (RNNs) using trax. We're going to predict the next set of characters in a sentence given the previous characters.
Since this is so long, I'm going to break it up into separate posts.
- Loading the Data: Load the data and convert it to tensors
- Generating Data: Create a batch generator for the tensors
- Creating the Model: Create a Gated Recurrent Unit (GRU) model
- Training the Model: Train the model
- Evaluating the Model: Evaluate the model's perplexity
- Generating Sentences: Generate new sentences using the model
First up: Loading the Data.
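As a preview of that first step, converting a line of text to a tensor just means mapping each character to an integer code. Here's a minimal pure-Python sketch; the function name `line_to_tensor` and the end-of-sentence integer `eos_int=1` are illustrative choices, not the exact implementation from the later posts:

```python
def line_to_tensor(line: str, eos_int: int = 1) -> list[int]:
    """Convert a line of text to a list of integer character codes.

    Each character becomes its Unicode code point, and an
    end-of-sentence marker (eos_int, an illustrative choice)
    is appended so the model knows where the line stops.
    """
    return [ord(char) for char in line] + [eos_int]


tensor = line_to_tensor("abc")
print(tensor)  # [97, 98, 99, 1]
```

Batching, padding, and the actual trax model are covered in the posts that follow.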