Arcadia takes its name from Greek mythology: a tranquil and idyllic region representing harmony, serenity, and natural beauty. We aim to help everyone find a better integration between humans and AI.
To achieve this goal, we provide a one-stop LLMOps solution:
- Dataset Management: stored and real-time data, multimodal data, pre-processing, vectorization
- Model Management: local/online LLMs (development, training, deployment), inference acceleration
- Application Management: development, optimization, and deployment with a visual editor
Furthermore, Arcadia can easily be hosted on any Kubernetes cluster and made production-ready by integrating kubebb (Kubernetes building blocks).
Arcadia's design and development follow the operator pattern, which extends the Kubernetes APIs.
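As a rough illustration of the operator pattern, the sketch below defines a hypothetical Dataset custom resource in Go using k8s.io/apimachinery; the kind and its fields are made up for this example and are not Arcadia's actual CRDs.

```go
// Hypothetical example of a custom resource type following the operator
// pattern; the Dataset kind and its fields are illustrative only and are
// not Arcadia's actual API.
package v1alpha1

import (
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
)

// DatasetSpec describes the desired state of a (hypothetical) Dataset.
type DatasetSpec struct {
	// Source points to where the raw data lives, e.g. an object store or URL.
	Source string `json:"source"`
	// ContentType hints at the data modality, e.g. "text" or "image".
	ContentType string `json:"contentType,omitempty"`
}

// DatasetStatus reports the observed state of the Dataset.
type DatasetStatus struct {
	// Ready is true once the dataset has been ingested and vectorized.
	Ready bool `json:"ready"`
}

// Dataset is a custom resource that extends the Kubernetes API; a controller
// (operator) watches these objects and reconciles the real world toward Spec.
type Dataset struct {
	metav1.TypeMeta   `json:",inline"`
	metav1.ObjectMeta `json:"metadata,omitempty"`

	Spec   DatasetSpec   `json:"spec,omitempty"`
	Status DatasetStatus `json:"status,omitempty"`
}
```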
If you don't have a Kubernetes cluster, you can create a kind cluster. Depending on whether you run the LLM worker on CPU or GPU, you can choose to:
- Visit our online documents
- Read the user guide
We provide a command-line tool, arctl, to interact with Arcadia. See here for more details.
- ✅ datasource management
- ✅ local dataset management
To enhance AI capabilities in Golang, we developed several packages. Here are examples of how to use them:
- chat_with_document: a chat server that allows you to chat with your documents
- embedding: shows how to embed your documents into a vector store with an embedding service
- rbac: shows how to use AI to inspect security risks in your RBAC configuration
- zhipuai: shows how to use the zhipuai client
- dashscope: shows how to use the dashscope client to chat with qwen-7b-chat / qwen-14b-chat / llama2-7b-chat-v2 / llama2-13b-chat-v2 and to create embeddings with dashscope text-embedding-v1 / text-embedding-async-v1
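For a feel of what the embedding example does, here is a minimal, self-contained sketch of the flow (split a document, embed the chunks, store the vectors). The EmbeddingClient and VectorStore interfaces below are hypothetical stand-ins, not the actual APIs of the packages above.

```go
// Hypothetical sketch of the "embed a document into a vector store" flow.
// EmbeddingClient and VectorStore are illustrative interfaces, not the
// actual APIs of the arcadia packages.
package embedexample

import (
	"context"
	"fmt"
	"strings"
)

// EmbeddingClient turns text chunks into vectors (e.g. via zhipuai or dashscope).
type EmbeddingClient interface {
	Embed(ctx context.Context, texts []string) ([][]float32, error)
}

// VectorStore persists vectors together with their source text (e.g. ChromaDB).
type VectorStore interface {
	Add(ctx context.Context, texts []string, vectors [][]float32) error
}

// splitIntoChunks is a naive fixed-size splitter used only for illustration.
func splitIntoChunks(doc string, chunkSize int) []string {
	words := strings.Fields(doc)
	var chunks []string
	for start := 0; start < len(words); start += chunkSize {
		end := start + chunkSize
		if end > len(words) {
			end = len(words)
		}
		chunks = append(chunks, strings.Join(words[start:end], " "))
	}
	return chunks
}

// indexDocument splits a document, embeds the chunks, and stores the vectors.
func indexDocument(ctx context.Context, doc string, ec EmbeddingClient, vs VectorStore) error {
	chunks := splitIntoChunks(doc, 200)
	vectors, err := ec.Embed(ctx, chunks)
	if err != nil {
		return fmt.Errorf("embedding failed: %w", err)
	}
	return vs.Add(ctx, chunks, vectors)
}
```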
Our embedding implementations are fully compatible with LangChain embeddings.
Our vector store implementations are fully compatible with LangChain vectorstores:
- ✅ ChromaDB
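To show what this compatibility means in practice, here is a hedged sketch: any type that satisfies an interface shaped like langchaingo's embeddings.Embedder (assumed here to require EmbedDocuments and EmbedQuery) can be plugged into LangChain-style pipelines and vector stores. The zeroEmbedder below is a dummy placeholder, not Arcadia's real implementation.

```go
// Sketch of what "compatible with LangChain embeddings" means: any type that
// satisfies the interface below (which mirrors, to our understanding,
// langchaingo's embeddings.Embedder) can be swapped into LangChain-style
// pipelines. The zeroEmbedder here is a dummy placeholder, not Arcadia's code.
package embedcompat

import "context"

// Embedder mirrors the shape of langchaingo's embeddings.Embedder interface.
type Embedder interface {
	// EmbedDocuments embeds a batch of texts.
	EmbedDocuments(ctx context.Context, texts []string) ([][]float32, error)
	// EmbedQuery embeds a single query string.
	EmbedQuery(ctx context.Context, text string) ([]float32, error)
}

// zeroEmbedder is a trivial implementation that returns zero vectors; a real
// embedder would call an embedding service (zhipuai, dashscope, ...) instead.
type zeroEmbedder struct {
	dim int
}

func (e zeroEmbedder) EmbedDocuments(ctx context.Context, texts []string) ([][]float32, error) {
	out := make([][]float32, len(texts))
	for i := range texts {
		out[i] = make([]float32, e.dim)
	}
	return out, nil
}

func (e zeroEmbedder) EmbedQuery(ctx context.Context, text string) ([]float32, error) {
	return make([]float32, e.dim), nil
}

// Compile-time check that zeroEmbedder satisfies the Embedder interface.
var _ Embedder = zeroEmbedder{dim: 4}
```

Because Go interfaces are satisfied structurally, a real embedder backed by zhipuai or dashscope can be dropped into the same slot without extra adapters.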
If you want to contribute to Arcadia, refer to the contribution guide.
If you need support, start with the troubleshooting guide, or create a GitHub issue.