Temporarily remove unused tokens during training to save RAM and speed up training.
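As a hedged illustration of the trick this repo describes (not its actual code), the sketch below shrinks a PyTorch embedding table to only the token IDs that occur in the training data, trains with the compact table, and scatters the learned rows back afterwards. All names and sizes are illustrative assumptions.

```python
# Minimal sketch: trim the embedding table to the tokens actually used,
# train on the compact table, then restore the learned rows.
import torch
import torch.nn as nn

def trim_embedding(full_emb: nn.Embedding, corpus_token_ids: torch.Tensor):
    """Return a compact embedding plus an old-id -> compact-id lookup table."""
    used = torch.unique(corpus_token_ids)                 # token IDs seen in training
    remap = torch.full((full_emb.num_embeddings,), -1, dtype=torch.long)
    remap[used] = torch.arange(used.numel())              # old id -> compact id
    compact = nn.Embedding(used.numel(), full_emb.embedding_dim)
    with torch.no_grad():
        compact.weight.copy_(full_emb.weight[used])       # keep existing rows
    return compact, used, remap

def restore_embedding(full_emb, compact, used):
    """Write the trained compact rows back into the full table."""
    with torch.no_grad():
        full_emb.weight[used] = compact.weight

# Usage sketch: remap a batch before the forward pass.
full = nn.Embedding(50_000, 512)
corpus = torch.randint(0, 1_000, (10_000,))               # toy corpus, few distinct tokens
compact, used, remap = trim_embedding(full, corpus)
hidden = compact(remap[corpus[:32]])                      # train against the small table
restore_embedding(full, compact, used)
```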
Seq2SeqSharp is a fast, flexible, tensor-based deep neural network framework written in .NET (C#). Its highlights include automatic differentiation, multiple network types (Transformer, LSTM, BiLSTM, and more), multi-GPU support, cross-platform operation (Windows, Linux, x86, x64, ARM), and multimodal models for text and images.
✨ Solve the multi-dimensional multiple knapsack problem using state-of-the-art reinforcement learning algorithms and transformers
Contains a PyTorch implementation of the Transformer and EvolvedTransformer architectures. WIP
A comprehensive codebase for AI & robotics.
Use Transformers to predict time-series smartphone positioning from GNSS data.
A Transformer encoder for performing an NLP task
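For readers new to the topic, a minimal sketch of what such projects typically build, using PyTorch's built-in encoder modules for sequence classification, follows. The vocabulary size, dimensions, and mean-pooling head are assumptions, not details from any listed repo.

```python
# Minimal sketch: a Transformer encoder classifying token sequences.
# Hyperparameters are illustrative; positional encodings omitted for brevity.
import torch
import torch.nn as nn

class EncoderClassifier(nn.Module):
    def __init__(self, vocab_size=30_000, d_model=256, nhead=8,
                 num_layers=4, num_classes=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=nhead,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=num_layers)
        self.head = nn.Linear(d_model, num_classes)

    def forward(self, token_ids, pad_mask=None):
        h = self.encoder(self.embed(token_ids),
                         src_key_padding_mask=pad_mask)
        return self.head(h.mean(dim=1))                   # mean-pool, then classify

model = EncoderClassifier()
logits = model(torch.randint(0, 30_000, (8, 128)))        # (batch=8, seq_len=128)
```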
Zeta implementation of "Rethinking Attention: Exploring Shallow Feed-Forward Neural Networks as an Alternative to Attention Layers in Transformers"
[IGARSS'22]: A Transformer-Based Siamese Network for Change Detection
Transformer Encoder with Multiscale Deep Learning for Pain Classification Using Physiological Signals
Augmenting Multitask Learning for Multiclass Anomaly Severity Detection on Drone Flight Logs
Deep Learning project October 2023
Using the similarity between embedded protein sequences to align them
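A rough sketch of the general approach that description implies: score residue pairs by the cosine similarity of their per-residue embeddings, then run Needleman-Wunsch-style dynamic programming over the similarity matrix. The gap penalty and the choice of cosine similarity are assumptions, not the repo's actual parameters.

```python
# Rough sketch: embedding-based global alignment scoring via DP.
import numpy as np

def align_score(emb_a: np.ndarray, emb_b: np.ndarray, gap: float = -0.5) -> float:
    """emb_a: (n, d), emb_b: (m, d) per-residue embeddings; returns alignment score."""
    a = emb_a / np.linalg.norm(emb_a, axis=1, keepdims=True)
    b = emb_b / np.linalg.norm(emb_b, axis=1, keepdims=True)
    sim = a @ b.T                                  # (n, m) cosine similarities
    n, m = sim.shape
    dp = np.zeros((n + 1, m + 1))
    dp[:, 0] = gap * np.arange(n + 1)              # leading gaps in sequence B
    dp[0, :] = gap * np.arange(m + 1)              # leading gaps in sequence A
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            dp[i, j] = max(dp[i-1, j-1] + sim[i-1, j-1],   # match/mismatch
                           dp[i-1, j] + gap,               # gap in B
                           dp[i, j-1] + gap)               # gap in A
    return dp[n, m]
```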
The implementation of DeBERTa
A comparative study of deep learning models for correctly identifying a patient's cancer type, aimed at streamlining the process of making a post on the Cancer Survivors Network website.
TAG-DTA: Binding Region-Guided Strategy to Predict Drug-Target Affinity Using Transformers
Projects built for self-learning purposes.
A deep learning classification tool for anomalous diffusion trajectories.
Transformer-Based Named Entity Recognition on Drone Flight Logs to Support Forensic Investigation
Repository for a transformer I coded from scratch and trained on the tiny-shakespeare dataset.
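The core of any such from-scratch transformer is scaled dot-product self-attention with a causal mask. Below is a minimal single-head PyTorch sketch of that building block; it is a generic illustration, not the linked repository's code.

```python
# Minimal single-head causal self-attention, the heart of a
# from-scratch decoder-style transformer. Generic illustration only.
import math
import torch
import torch.nn as nn
import torch.nn.functional as F

class CausalSelfAttention(nn.Module):
    def __init__(self, d_model: int):
        super().__init__()
        self.qkv = nn.Linear(d_model, 3 * d_model)
        self.proj = nn.Linear(d_model, d_model)

    def forward(self, x):                                  # x: (batch, seq, d_model)
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        att = (q @ k.transpose(-2, -1)) / math.sqrt(k.size(-1))
        mask = torch.triu(torch.ones_like(att), diagonal=1).bool()
        att = att.masked_fill(mask, float("-inf"))         # no peeking at future tokens
        return self.proj(F.softmax(att, dim=-1) @ v)

x = torch.randn(2, 16, 64)
out = CausalSelfAttention(64)(x)                           # (2, 16, 64)
```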