Replies: 1 comment
-
We will add a PPO + LSTM demo this February.
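
In the meantime, below is a minimal PyTorch sketch of the general recurrent-PPO idea, independent of DI-Engine's actual modules: names such as `RecurrentActorCritic`, `obs_dim`, `act_dim`, and `hidden_size` are illustrative assumptions, not DI-Engine API. The two essential changes versus feed-forward PPO are (1) carrying the LSTM hidden state `(h, c)` across timesteps during rollout and (2) resetting it at episode boundaries via the `done` mask.

```python
# A minimal sketch of adding LSTM memory to a PPO actor-critic.
# NOT DI-Engine's API: all names here are illustrative assumptions.
import torch
import torch.nn as nn


class RecurrentActorCritic(nn.Module):
    """Actor-critic with an LSTM torso, so both the policy logits and
    the value estimate condition on the recurrent hidden state."""

    def __init__(self, obs_dim: int, act_dim: int, hidden_size: int = 128):
        super().__init__()
        self.encoder = nn.Linear(obs_dim, hidden_size)
        self.lstm = nn.LSTM(hidden_size, hidden_size, batch_first=True)
        self.policy_head = nn.Linear(hidden_size, act_dim)  # action logits
        self.value_head = nn.Linear(hidden_size, 1)         # state value

    def forward(self, obs, state=None, done=None):
        # obs:   (batch, seq_len, obs_dim)
        # state: (h, c) carried over from the previous chunk, or None
        # done:  (batch, seq_len) mask, 1 where an episode ended
        x = torch.relu(self.encoder(obs))
        if done is None:
            x, state = self.lstm(x, state)
        else:
            # Step through time so the hidden state can be zeroed at
            # episode boundaries instead of leaking across episodes.
            outputs = []
            for t in range(x.size(1)):
                if state is not None:
                    mask = (1.0 - done[:, t].float()).view(1, -1, 1)
                    state = (state[0] * mask, state[1] * mask)
                out, state = self.lstm(x[:, t:t + 1], state)
                outputs.append(out)
            x = torch.cat(outputs, dim=1)
        return self.policy_head(x), self.value_head(x).squeeze(-1), state
```

At update time, PPO re-evaluates log-probs and values over stored trajectory chunks, so the usual trick is to also save the hidden state observed at the start of each chunk during collection and use it to re-initialize the LSTM when replaying that chunk in a minibatch. For H-PPO on Gym-hybrid, the same torso would feed two policy heads (discrete action type and continuous action parameters) instead of one.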
-
Hi,
Would it be possible for you to share the idea of how to add LSTM memory to the PPO algorithm in DI-Engine? I am working on the Gym-hybrid environment using H-PPO.
Thank you.