ppo-SnowballTarget / README.md

Commit History

First training of PPO agent on SnowballTarget environment.
90e9ddb

atorre committed on