🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
This repository contains a Seq2Seq (Sequence to Sequence) model designed for probabilistic forecasting.
Implementation of a Transformer model from scratch in PyTorch for language translation.
1 line for thousands of state-of-the-art NLP models in hundreds of languages. The fastest and most accurate way to solve text problems.
Code to address Natural Language Generation Tasks via Transformer Architecture
NAACL '24 (Demo) / MLSys @ NeurIPS '23 - RedCoast: A Lightweight Tool to Automate Distributed Training and Inference
An elegant PyTorch implementation of transformers
Lingvo
TorchSharp functionality check and model implementation
A numpy implementation of the Transformer model in "Attention is All You Need"
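The scaled dot-product attention at the core of "Attention is All You Need" is small enough to sketch in plain NumPy. This is a minimal illustrative sketch, not code from the repository above:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Compute softmax(Q K^T / sqrt(d_k)) V with plain NumPy.

    Q: (seq_q, d_k) queries, K: (seq_k, d_k) keys, V: (seq_k, d_v) values.
    """
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)               # (seq_q, seq_k) similarities
    scores -= scores.max(axis=-1, keepdims=True)  # shift for numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over the keys
    return weights @ V                            # weighted sum of values
```

Each output row is a convex combination of the value rows, weighted by how strongly the corresponding query matches each key.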
TensorFlow implementations of losses for sequence to sequence machine learning models
Gateway into the John Snow Labs Ecosystem
An improved implementation of Beam Search Decoding in RNN-based Seq2Seq Architecture
GraphMM: Graph-based Vehicular Map Matching by Leveraging Trajectory and Road Correlations
DNA sequence generation/classification using transformers
Sequence-to-sequence framework with a focus on Neural Machine Translation based on PyTorch
A Sequence-to-Sequence (Seq2Seq) model with LSTM units for text summarization, trained on the BBC News Summary dataset and implemented with an encoder-decoder architecture for information extraction and summarization.
Seq2SeqSharp is a tensor-based, fast and flexible deep neural network framework written in .NET (C#). It has many highlighted features, such as automatic differentiation, different network types (Transformer, LSTM, BiLSTM and so on), multi-GPU support, cross-platform operation (Windows, Linux, x86, x64, ARM), multimodal models for text and images, and more.
Natural Language Processing (NLP)