Arena is a new Embodied AI platform for robotic task completion in simulated environments. Arena was developed with the objective of advancing research in Human-Robot Interaction (HRI) for robot task-completion challenges. Building embodied agents for Arena involves working on key science aspects such as Multimodal Understanding and Reasoning, Embodied Conversational AI, Imitation and Reinforcement Learning, Teachable AI, and Robotic Task Planning.
This repository includes the codebase for interacting with the Arena executable. It also provides several scripts to fetch the dataset, a placeholder model, and other auxiliary tools. The Arena executable is subject to a separate license that allows use for non-commercial purposes only.
The recommended AWS EC2 instance configuration for running Arena is:
- Number of vCPUs: 8
- Number of GPUs: 1
- Memory: 32 GiB
- Storage: 200 GiB
- Operating system: Amazon Linux 2 (ami-0496aeed90a040b1b) or Ubuntu 18.04 (ami-0475f1fb0e9b1f73f)
Please refer to this tutorial for information on how to create an AWS EC2 instance with the aforementioned configuration.
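Before installing, you can sanity-check that a Linux host meets the vCPU and memory minimums listed above. The helper below is not part of the repository; it is a minimal sketch that reads `/proc/meminfo` and `os.cpu_count()` on a Linux host:

```python
import os


def parse_meminfo_gib(meminfo_text: str) -> float:
    """Parse /proc/meminfo content and return total memory in GiB."""
    for line in meminfo_text.splitlines():
        if line.startswith("MemTotal:"):
            kib = int(line.split()[1])  # value is reported in kB (KiB)
            return kib / (1024 * 1024)
    raise ValueError("MemTotal not found in /proc/meminfo")


def host_meets_minimums(min_vcpus: int = 8, min_mem_gib: float = 32.0) -> bool:
    """Return True if this Linux host meets the recommended vCPU/memory minimums."""
    with open("/proc/meminfo") as f:
        mem_ok = parse_meminfo_gib(f.read()) >= min_mem_gib
    cpu_ok = (os.cpu_count() or 0) >= min_vcpus
    return cpu_ok and mem_ok
```

This only checks CPU and memory; GPU and storage requirements still need to be verified separately (e.g. with `nvidia-smi` and `df -h`).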
- Log in to the EC2 instance from the AWS console
- Pull the AlexaArena repository from GitHub into the $HOME directory (note: clone into AlexaArena for script compatibility): https://github.com/amazon-science/alexa-arena
git clone https://github.com/amazon-science/alexa-arena.git AlexaArena
- If you have not done so already, install the AWS CLI: https://docs.aws.amazon.com/cli/latest/userguide/getting-started-install.html
- Run the script to download the Arena executable:
bash scripts/fetch_arena.sh
This script downloads and extracts the Arena binaries into "$HOME/AlexaArena/arena/"
- Run "$HOME/AlexaArena/scripts/install_dependencies.sh"
- Once the script finishes, go to the "AlexaArena/arena_installation_test" folder
- Run "./run_linux.sh". You should see "Arena dependencies installation test is completed successfully" if the installation is successful.
Note: The installation script above has been tested on AWS EC2 instances [instance types: g4dn.2xlarge, p3.8xlarge; OS: Amazon Linux 2, Ubuntu 18.04]. If you plan to use a different cloud-based instance or a local machine, the installation steps may vary. Please refer to this for details about the dependencies.
We provide two separate datasets to assist model training, both made available under the CC BY-NC 4.0 license. The first dataset contains trajectory data: robot action trajectories annotated with human natural language instructions and question-answer pairs. It may be useful for training and evaluating robot models for task completion. The second dataset contains image data generated via Arena, and it may be useful for training and evaluating vision models that work in the Arena environment. Please find detailed information about the datasets and how to download them here and here.
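As a starting point for working with the trajectory data, the sketch below iterates over instruction/action pairs in one trajectory file. The field names used here ("annotations", "instruction", "actions") are hypothetical, not the actual dataset schema; consult the dataset documentation for the real structure.

```python
import json


def iter_instruction_action_pairs(trajectory_path: str):
    """Yield (instruction, action_sequence) pairs from one trajectory file.

    The field names below ("annotations", "instruction", "actions") are
    illustrative only; the real schema is described in the dataset docs.
    """
    with open(trajectory_path) as f:
        trajectory = json.load(f)
    for annotation in trajectory.get("annotations", []):
        yield annotation["instruction"], annotation["actions"]
```

A typical use would be collecting all pairs into a list for a supervised training loop: `pairs = list(iter_instruction_action_pairs("trajectory.json"))`.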
We also provide baseline models for robot task completion. Please find detailed information here.
In addition to Arena simulator and model training/evaluation scripts, this package includes some auxiliary tools that could help during model development. Please find them below.
It is a web application that enables users to communicate with the robot in a simulated environment. It allows controlling an AI agent to alter objects' states in real time through a chat interface. It can launch a game from a CDF (Challenge Definition Format) file and complete it using natural language commands. Each command is sent to a model that generates a sequence of primitive actions. The Arena executable then processes these actions, and the outcome is shown in the web browser. Please find the web-tool UI below.
To learn more about it, please click here.
Arena Debugger is a software tool for testing and debugging the end-to-end workflow. There are three critical steps in debugging a model for Arena:
- Process the input utterance and predict actions and object classes
- Predict masks for the objects
- Generate a JSON in the required format from the predicted actions and masks
This tool allows developers to pause and inspect the outcome after each step. Please check the Arena Debugger README for more information.
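The three debugging steps above can be sketched as a small pipeline. Everything here is a stub: the action vocabulary, the mask representation, and the JSON key names are all hypothetical placeholders, not the format Arena actually requires.

```python
import json


def predict_actions(utterance: str):
    """Step 1 (stub): map an utterance to (action, object_class) pairs.

    A real model replaces this; the action vocabulary is illustrative."""
    return [("goto", "table"), ("pickup", "mug")]


def predict_mask(object_class: str):
    """Step 2 (stub): return a binary segmentation mask for the object.

    Shown as a tiny nested list; a real model outputs an image-sized mask."""
    return [[0, 1], [1, 0]]


def build_output_json(utterance: str) -> str:
    """Step 3: combine predicted actions and masks into one JSON payload.

    The key names here are illustrative, not Arena's required format."""
    steps = []
    for action, object_class in predict_actions(utterance):
        steps.append({
            "action": action,
            "object_class": object_class,
            "mask": predict_mask(object_class),
        })
    return json.dumps({"utterance": utterance, "steps": steps})
```

With the debugger, you would pause after each of the three functions and inspect its output before it is handed to the next stage.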
The Alexa Arena Challenge is hosted on the EvalAI platform. Please find more details here: https://eval.ai/web/challenges/challenge-page/1903/overview
For the challenge, this module provides the code required to produce the metadata output files. More information is available here.
Alexa Arena has been used in:
- Alexa Arena: A User-Centric Interactive Platform for Embodied AI. PDF
Gao, Q., Thattai, G., Gao, X., Shakiah, S., Pansare, S., Sharma, V., ... & Natarajan, P.
arXiv preprint arXiv:2303.01586.
If you use the platform, please consider citing our paper.
@article{gao2023alexa,
title={Alexa Arena: A User-Centric Interactive Platform for Embodied AI},
author={Gao, Qiaozi and Thattai, Govind and Gao, Xiaofeng and Shakiah, Suhaila and Pansare, Shreyas and Sharma, Vasu and Sukhatme, Gaurav and Shi, Hangjie and Yang, Bofei and Zheng, Desheng and others},
journal={arXiv preprint arXiv:2303.01586},
year={2023}
}
See CONTRIBUTING for more information.
This library is licensed under the LGPL-2.1 License.