MDPs and POMDPs in Julia - An interface for defining, solving, and simulating fully and partially observable Markov decision processes on discrete and continuous spaces.
Deep Reinforcement Learning in C#
LLMTechSite, focused on the technology ecosystem of the artificial general intelligence (AGI) field.
🧑🏫 50! Implementations/tutorials of deep learning papers with side-by-side notes 📝; including transformers (original, xl, switch, feedback, vit, ...), optimizers (adam, adabelief, ...), gans(cyclegan, stylegan2, ...), 🎮 reinforcement learning (ppo, dqn), capsnet, distillation, ... 🧠
Library for reinforcement learning in C++
OpenDILab Decision AI Engine
Reinforcement Learning (RL) for the Gymnasium CarRacing-v2 environment
Instructions and experiments for running SLM Lab in Google Colab with Atari Freeway Game
🏛️A research-friendly codebase for fast experimentation of single-agent reinforcement learning in JAX • End-to-End JAX RL
Reinforcement Learning Algorithms
Training robots to play soccer
PacmanRL - Reinforcement Learning for Pacman (Q-Learning / SARSA)
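Several entries above implement tabular methods such as Q-learning and SARSA. As a rough illustration of what tabular Q-learning looks like (the toy corridor environment and every name below are made up for this sketch, not taken from PacmanRL or any other listed repository):

```python
# Minimal tabular Q-learning sketch on a made-up 4-state corridor.
import random

def q_learning(n_states, n_actions, step, episodes=500,
               alpha=0.1, gamma=0.9, epsilon=0.1):
    """step(s, a) -> (next_state, reward, done) defines the environment."""
    Q = [[0.0] * n_actions for _ in range(n_states)]
    for _ in range(episodes):
        s, done = 0, False
        while not done:
            # Epsilon-greedy action selection.
            if random.random() < epsilon:
                a = random.randrange(n_actions)
            else:
                a = max(range(n_actions), key=lambda x: Q[s][x])
            s2, r, done = step(s, a)
            # Q-learning update: bootstrap from the greedy next-state value.
            Q[s][a] += alpha * (r + gamma * max(Q[s2]) - Q[s][a])
            s = s2
    return Q

# Toy environment: action 1 moves right, action 0 moves left;
# reaching state 3 yields reward 1 and ends the episode.
def step(s, a):
    s2 = min(s + 1, 3) if a == 1 else max(s - 1, 0)
    return s2, (1.0 if s2 == 3 else 0.0), s2 == 3

random.seed(0)
Q = q_learning(4, 2, step)
# After training, moving right should dominate in every non-terminal state.
```

SARSA differs only in the update target: it bootstraps from the action actually taken next (`Q[s2][a2]`) rather than from `max(Q[s2])`.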
The Go implementation of the NeuroEvolution of Augmenting Topologies (NEAT) method to evolve and train artificial neural networks without error back-propagation
Code for the SIGKDD 2022 paper: Group-wise Reinforcement Feature Generation for Optimal and Explainable Representation Space Reconstruction
Really fast end-to-end JAX RL implementations
EVA Deep Reinforcement Learning. This repository contains my submission to the MSE Deep Reinforcement Learning project.
Collection of materials and code samples on reinforcement learning / optimal control and game theory
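On the optimal-control side, the standard baseline for finite MDPs is value iteration. A minimal sketch under assumed conventions (the two-state MDP and the `P`/`R` encoding below are illustrative inventions, not from any repository above):

```python
# Minimal value iteration sketch for a finite MDP.

def value_iteration(P, R, gamma=0.9, tol=1e-8):
    """P[s][a] is a list of (prob, next_state) pairs; R[s][a] is the reward."""
    n = len(P)
    V = [0.0] * n
    while True:
        # Bellman optimality backup for every state.
        V_new = [
            max(R[s][a] + gamma * sum(p * V[s2] for p, s2 in P[s][a])
                for a in range(len(P[s])))
            for s in range(n)
        ]
        if max(abs(x - y) for x, y in zip(V, V_new)) < tol:
            return V_new
        V = V_new

# Two states, two actions: in state 0, action 1 moves to state 1 (reward 0);
# in state 1, action 0 stays there with reward 1, action 1 returns to state 0.
P = [[[(1.0, 0)], [(1.0, 1)]],
     [[(1.0, 1)], [(1.0, 0)]]]
R = [[0.0, 0.0],
     [1.0, 0.0]]
V = value_iteration(P, R)
# V[1] converges to 1 / (1 - gamma) = 10; V[0] to gamma * V[1] = 9.
```

The backup is a contraction with modulus `gamma`, so the sup-norm distance between successive iterates shrinks geometrically, which is why the simple tolerance check above suffices.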
DI-engine docs (Chinese and English)