Fill-in-the-BERT uses a pre-trained BERT masked language model for the task of filling in the blanks.

Fill-in-the-BERT

(Image: Google BERT representative image)

Fill-in-the-BERT is a fill-in-the-blanks model trained to predict the missing word in a sentence. For the purpose of this demo we use pre-trained bert-base-uncased as the prediction model. If you are new to BERT, please read BERT-Explained and the Visual Guide to BERT.
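The core prediction the model performs can be reproduced in a few lines with the Hugging Face Transformers fill-mask pipeline and the same bert-base-uncased checkpoint (this is a standalone sketch, not the repo's actual app.py code):

```python
from transformers import pipeline

# Load a fill-mask pipeline backed by bert-base-uncased, the same
# checkpoint this demo uses for prediction.
fill = pipeline("fill-mask", model="bert-base-uncased")

# BERT's mask token is [MASK]; the pipeline returns the top candidate
# tokens for the blank along with their softmax scores.
predictions = fill("The capital of France is [MASK].")
for p in predictions:
    print(p["token_str"], round(p["score"], 3))
```

Each entry in `predictions` is a dict with the candidate token (`token_str`), its score, and the completed sequence.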

Steps to Run

$> python3 app.py

Open index.html in the browser and start typing 💬
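app.py serves predictions over HTTP for index.html to consume. A minimal sketch of what such a Flask endpoint might look like is below; the `/predict` route name and the JSON payload shape are assumptions for illustration, not the repo's exact interface:

```python
from flask import Flask, jsonify, request
from transformers import pipeline

app = Flask(__name__)

# Load the fill-mask model once at startup so requests are fast.
fill = pipeline("fill-mask", model="bert-base-uncased")

@app.route("/predict", methods=["POST"])
def predict():
    # Hypothetical payload: {"text": "Paris is the [MASK] of France."}
    text = request.get_json()["text"]
    preds = fill(text)
    return jsonify([{"token": p["token_str"], "score": p["score"]} for p in preds])

# app.run(port=5000)  # uncomment to serve locally
```

The front end would then POST the user's sentence (containing `[MASK]`) to this endpoint and render the returned candidates.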

Demo

(Demo GIF: Fill in the blanks - BERT demo)

Details

  • Model - bert-base-uncased
  • Pre-trained Task - MaskedLM

P.S. The attention visualisation is done for layer 3 by averaging across all attention heads. Read more about attention heads and what they mean at Visualizing inner workings of attention.

Technologies Used

  1. PyTorch
  2. HTML/Bootstrap
  3. Flask
