Testing the possible use of Transformer models for various NLP tasks, leveraging pretrained BERT models from Hugging Face
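A minimal sketch of this kind of experiment using the transformers pipeline API; the checkpoint names below are common defaults and an assumption, not necessarily the ones used in the repository.

```python
# Probe a pretrained BERT checkpoint on two common NLP tasks
# via the Hugging Face transformers pipeline API.
from transformers import pipeline

# Masked-token prediction with plain BERT
fill_mask = pipeline("fill-mask", model="bert-base-uncased")
print(fill_mask("The goal of NLP is to [MASK] human language."))

# Named entity recognition with a BERT checkpoint fine-tuned for NER
# (dslim/bert-base-NER is an illustrative choice, not the repo's)
ner = pipeline("ner", model="dslim/bert-base-NER", aggregation_strategy="simple")
print(ner("Hugging Face is based in New York City."))
```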
This project provides a Streamlit app to summarize texts and identify named entities. It uses both T5 and BART as summarization models.
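A hedged sketch of the summarization side, assuming stock t5-small and facebook/bart-large-cnn checkpoints (the app's actual checkpoints and generation settings may differ); in the app this would sit behind Streamlit widgets.

```python
# Summarize the same text with both T5 and BART for comparison.
from transformers import pipeline

text = (
    "The Transformer architecture replaced recurrence with self-attention, "
    "allowing models such as T5 and BART to be pretrained on large corpora "
    "and then fine-tuned for tasks like abstractive summarization."
)

t5 = pipeline("summarization", model="t5-small")
bart = pipeline("summarization", model="facebook/bart-large-cnn")

for name, summarizer in [("T5", t5), ("BART", bart)]:
    summary = summarizer(text, max_length=40, min_length=10, do_sample=False)
    print(name, "->", summary[0]["summary_text"])
```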
This repository contains my practice while learning LLMs, specifically BERT, T5, and GPT-2
Fine-tune a T5 model for the text2text-generation use case to generate human-like text
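A minimal fine-tuning sketch with the transformers Seq2SeqTrainer, assuming a hypothetical JSON dataset with "source" and "target" text columns; the checkpoint and hyperparameters are placeholders, not this repository's configuration.

```python
# Fine-tune T5 on source/target text pairs with the Trainer API.
from datasets import load_dataset
from transformers import (
    AutoTokenizer,
    AutoModelForSeq2SeqLM,
    DataCollatorForSeq2Seq,
    Seq2SeqTrainingArguments,
    Seq2SeqTrainer,
)

checkpoint = "t5-small"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSeq2SeqLM.from_pretrained(checkpoint)

# Hypothetical dataset file with "source" and "target" columns.
dataset = load_dataset("json", data_files="pairs.json")["train"]

def preprocess(batch):
    model_inputs = tokenizer(batch["source"], max_length=512, truncation=True)
    labels = tokenizer(text_target=batch["target"], max_length=128, truncation=True)
    model_inputs["labels"] = labels["input_ids"]
    return model_inputs

tokenized = dataset.map(preprocess, batched=True, remove_columns=dataset.column_names)

trainer = Seq2SeqTrainer(
    model=model,
    args=Seq2SeqTrainingArguments(
        output_dir="t5-text2text",
        num_train_epochs=3,
        per_device_train_batch_size=8,
        learning_rate=3e-4,
    ),
    train_dataset=tokenized,
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
)
trainer.train()
```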
Classification, ADSA, and text summarisation project for the BridgeI2I task at the Inter IIT 2021 competition. Silver medalists.
Dealing with grammatical errors in Slovenian (school) written works
Factuality checking of SemRep predications
Text summarization using BART, T5, ProphetNet, and Pegasus on indic_dataset
Baseline: google/flan-t5. Fine-tuning: LMQG, LoRA
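A hedged sketch of attaching LoRA adapters to a google/flan-t5 checkpoint with the peft library; the rank, alpha, and target modules below are illustrative defaults, not this project's settings (which also involve LMQG).

```python
# Wrap flan-t5 with LoRA adapters so only low-rank updates are trained.
from transformers import AutoModelForSeq2SeqLM
from peft import LoraConfig, TaskType, get_peft_model

model = AutoModelForSeq2SeqLM.from_pretrained("google/flan-t5-base")

lora_config = LoraConfig(
    task_type=TaskType.SEQ_2_SEQ_LM,
    r=8,                        # low-rank dimension (assumed value)
    lora_alpha=32,              # scaling factor (assumed value)
    lora_dropout=0.05,
    target_modules=["q", "v"],  # T5 attention query/value projections
)

model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # only the adapter weights are trainable
```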
A Multimodal Approach to Convert Book Summaries into Artistic Book Covers
Unlock the depths of Wikipedia with WikiBot. It delivers concise summaries and suggests related articles to fuel your curiosity and guide your exploration.
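A rough sketch of this summarize-and-suggest behaviour using the third-party wikipedia package; the function calls are from that package, not from the WikiBot project itself, and the topic is a placeholder.

```python
# Fetch a concise summary of an article, then surface a few
# linked articles as suggested follow-ups.
import wikipedia

topic = "Natural language processing"
print(wikipedia.summary(topic, sentences=2))  # concise summary

page = wikipedia.page(topic)
print("Related articles:", page.links[:5])    # suggestions to explore
```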
Modern applied deep learning with Transformer-based models.
A final project focused on deep learning and text summarization exercises.