Massively Parallel Deep Reinforcement Learning. 🔥
CURL: Contrastive Unsupervised Representation Learning for Sample-Efficient Reinforcement Learning
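For orientation, here is a minimal sketch of the contrastive (InfoNCE) objective that CURL-style methods optimize alongside the RL loss: two augmented crops of the same observation are embedded by a query and a key encoder, and a bilinear similarity is trained so each query matches its own key within the batch. The names (`CurlHead`, `query_encoder`, `key_encoder`) are illustrative assumptions, not taken from the paper's code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class CurlHead(nn.Module):
    """Contrastive head with a learned bilinear similarity (illustrative sketch)."""
    def __init__(self, feature_dim: int):
        super().__init__()
        self.W = nn.Parameter(torch.rand(feature_dim, feature_dim))

    def forward(self, z_q: torch.Tensor, z_k: torch.Tensor) -> torch.Tensor:
        # logits[i, j] = z_q[i] . W . z_k[j]; positives lie on the diagonal.
        logits = z_q @ self.W @ z_k.T
        logits = logits - logits.max(dim=1, keepdim=True).values  # numerical stability
        labels = torch.arange(z_q.size(0), device=z_q.device)
        return F.cross_entropy(logits, labels)

# Assumed usage: z_q = query_encoder(aug1(obs)); z_k = key_encoder(aug2(obs)).detach()
# loss = CurlHead(z_q.size(1))(z_q, z_k)   # minimized jointly with the RL objective
```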
Related papers for reinforcement learning, including classic papers and the latest papers from top conferences
Solutions to the assignments of the Deep Reinforcement Learning course (CS285) taught at the University of California, Berkeley, implemented in the PyTorch framework
A repository of implementations for the deep reinforcement learning course lectured at Samsung
TD3, SAC, IQN, Rainbow, PPO, Ape-X, etc., implemented in TF1.x
Mirror Descent Policy Optimization
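A hedged sketch of the kind of KL-regularized surrogate that mirror-descent-style policy updates minimize, assuming on-policy log-probabilities and advantages have already been computed; the function name and the annealed `kl_coef` argument are assumptions for illustration, not the repo's API.

```python
import torch

def mdpo_surrogate(logp_new, logp_old, advantages, kl_coef):
    # Importance ratio between the current policy and the data-collecting policy.
    ratio = torch.exp(logp_new - logp_old)
    # Advantage-weighted policy improvement term (negated for minimization).
    policy_term = -(ratio * advantages).mean()
    # Importance-weighted sample estimate of KL(pi_new || pi_old); the coefficient
    # is typically annealed over iterations in mirror-descent schemes.
    kl_est = (ratio * (logp_new - logp_old)).mean()
    return policy_term + kl_coef * kl_est
```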
A repository of reinforcement learning algorithms implemented with PyTorch
Reinforcement Learning - implementations of exercises and algorithms from the Sutton & Barto book and David Silver's RL course, in Python with OpenAI Gym.
D2C (Data-driven Control Library) is a library for data-driven control based on reinforcement learning.
DQN implementation that learns to control OpenAI Gym's CartPole using only visual input (raw pixels)
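As a rough sketch of the loss such a pixel-based DQN agent minimizes (assuming a convolutional `q_net`, a frozen `q_target` copy, and a replay batch of stacked frames; all names here are illustrative, not from the repo):

```python
import torch
import torch.nn.functional as F

def dqn_loss(q_net, q_target, obs, act, rew, next_obs, done, gamma=0.99):
    # Q(s, a) for the actions actually taken in the replay batch.
    q_sa = q_net(obs).gather(1, act.unsqueeze(1)).squeeze(1)
    with torch.no_grad():
        # Bootstrapped TD target from the frozen target network; zero beyond terminal states.
        target = rew + gamma * (1.0 - done) * q_target(next_obs).max(dim=1).values
    return F.smooth_l1_loss(q_sa, target)
```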
CURLA: CURL x CARLA -- Robust end-to-end Autonomous Driving by combining Contrastive Learning and Reinforcement Learning
💡 Grasp - Pick-and-place with a robotic hand 👨🏻‍💻
Bayesian Actor-Critic with Neural Networks. Developing an OpenAI Gym toolkit for Bayesian AC reinforcement learning.
The repo for the FERMI FEL paper using model-based and model-free reinforcement learning methods to solve a particle accelerator operation problem.
This repository contains all of the Reinforcement Learning-related projects I've worked on. The projects are part of the graduate course at the University of Tehran.
Deep Reinforcement Learning implementation in Keras of an AI controlling the popular Flappy Bird videogame, using Asynchronous Advantage Actor Critic (A3C)
This repo contains implementations of algorithms such as Q-learning, SARSA, TD learning, and policy gradient
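For reference, minimal sketches of the tabular Q-learning and SARSA updates such repositories typically cover; the function names and hyperparameter defaults are illustrative, and `Q` is assumed to be a state-by-action NumPy table.

```python
import numpy as np

def q_learning_update(Q, s, a, r, s_next, alpha=0.1, gamma=0.99):
    """Off-policy TD update: bootstrap from the greedy action in the next state."""
    td_target = r + gamma * np.max(Q[s_next])
    Q[s, a] += alpha * (td_target - Q[s, a])

def sarsa_update(Q, s, a, r, s_next, a_next, alpha=0.1, gamma=0.99):
    """On-policy TD update: bootstrap from the action actually taken in the next state."""
    td_target = r + gamma * Q[s_next, a_next]
    Q[s, a] += alpha * (td_target - Q[s, a])
```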
Deep RL for Temporal Credit Assignment in decision processes with delayed rewards