Personalized Autocomplete: Next-Word Prediction Task

This project is the result of my semester-long exploration of next-word prediction, where I aimed to create a personalized autocomplete system using three distinct architectures: stacked LSTMs, Seq2Seq with Attention and LSTMs, and GPT-2.

Three Distinct Subprojects

This project is divided into three major subprojects, each progressively more complex and powerful:

  1. Stacked LSTMs (a minimal sketch of this kind of model follows this list)
  2. Sequence-to-Sequence with Attention and LSTMs
  3. GPT-2: in this final subproject, I implemented GPT-2 from scratch, and this is where things get truly intriguing and intricate.
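
To give a flavor of the first subproject, here is a minimal sketch of a stacked-LSTM next-word model in PyTorch. Everything in it is illustrative: the class name, dimensions, and hyperparameters are assumptions made for this sketch, not the code from the actual notebooks.

```python
import torch
import torch.nn as nn

class StackedLSTMPredictor(nn.Module):
    """Illustrative stacked-LSTM next-word model (a sketch, not the repo's code)."""

    def __init__(self, vocab_size: int, embed_dim: int = 128,
                 hidden_dim: int = 256, num_layers: int = 2):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        # num_layers > 1 is what makes the LSTM "stacked": each layer
        # feeds its hidden-state sequence to the layer above it.
        self.lstm = nn.LSTM(embed_dim, hidden_dim,
                            num_layers=num_layers, batch_first=True)
        self.head = nn.Linear(hidden_dim, vocab_size)

    def forward(self, token_ids: torch.Tensor) -> torch.Tensor:
        x = self.embedding(token_ids)        # (batch, seq_len, embed_dim)
        out, _ = self.lstm(x)                # (batch, seq_len, hidden_dim)
        return self.head(out[:, -1, :])      # next-word logits from last step

# Predict a next-word distribution for a random 5-token context.
model = StackedLSTMPredictor(vocab_size=10_000)
logits = model(torch.randint(0, 10_000, (1, 5)))
next_word_id = logits.argmax(dim=-1)
```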

Navigating the Repository

Ideally you would work through the subprojects in the order presented, since they build on one another and progress from the simplest to the most advanced, but doing so takes a lot of time. If you are short on time, I recommend diving straight into the final GPT-2 project.

The GPT-2 project not only offers the deepest insights but also includes explanations of the previous architectures.
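
The key mechanism that separates GPT-2 from the LSTM-based models is masked (causal) self-attention: each position may attend only to itself and earlier positions, which keeps generation autoregressive. Below is a minimal PyTorch sketch of one such layer; again, the names and dimensions are illustrative assumptions, not the repository's implementation.

```python
import torch
import torch.nn as nn

class CausalSelfAttention(nn.Module):
    """Illustrative GPT-2-style masked self-attention layer (a sketch only)."""

    def __init__(self, embed_dim: int = 768, num_heads: int = 12,
                 max_len: int = 1024):
        super().__init__()
        self.attn = nn.MultiheadAttention(embed_dim, num_heads,
                                          batch_first=True)
        # True above the diagonal means "may not attend": every position
        # sees only itself and earlier positions.
        mask = torch.triu(torch.ones(max_len, max_len, dtype=torch.bool),
                          diagonal=1)
        self.register_buffer("causal_mask", mask)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        seq_len = x.size(1)
        out, _ = self.attn(x, x, x,
                           attn_mask=self.causal_mask[:seq_len, :seq_len])
        return out

# One layer applied to a batch of 10 token embeddings of width 768.
layer = CausalSelfAttention()
y = layer(torch.randn(1, 10, 768))   # shape: (1, 10, 768)
```

A full GPT-2 block additionally wraps this layer in layer normalization, residual connections, and a position-wise feed-forward network.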

Get in Touch

If you have any questions, feedback, or would like access to the Colab notebooks used in these projects, please don't hesitate to reach out to me. I'm here to help and share the knowledge I've gained during this exciting journey. Happy coding! 🚀
