FreeChat: Create Friends for Yourself with AI

English | 中文版

Introduction

Welcome! FreeChat aims to build a cloud-native, robust, and quickly commercializable enterprise-level AI virtual character platform.

It also serves as a prompt engineering platform.

It is recommended to run Ollama + FreeChat locally to test your models; see the instructions for running locally below.

Features

  • Primarily uses Java and emphasizes security, robustness, scalability, traceability, and maintainability.
  • Provides an account system and permission management, with support for OAuth2 authentication. Introduces the concept of an "organization" and the related permission constraints.
  • Extensively employs distributed technologies and caching to support high concurrency access.
  • Provides flexible character customization options, supports direct intervention in prompts, and supports configuring multiple backends for each character.
  • Offers a comprehensive set of Open APIs with more than 180 interfaces, along with Java/Python/TypeScript SDKs. These interfaces make it easy to build systems for end users.
  • Supports setting RAG (Retrieval Augmented Generation) for characters.
  • Supports long-term memory and preset memory for characters.
  • Supports characters proactively initiating chats.
  • Supports setting quota limits for characters.
  • Supports importing and exporting characters.
  • Supports character-to-character chats.
  • Supports debugging and sharing individual prompts.

Snapshots

On PC

Home Page

Home Page Snapshot

Development View

Development View Snapshot

Chat View

Chat View Snapshot

On Mobile

Chat Snapshot 1 Chat Snapshot 2
Chat Snapshot 3 Chat Snapshot 4

Character Design

flowchart TD
    A(Character) --> B(Profile)
    A --> C(Knowledge/RAG)
    A --> D(Album)
    A --> E(Backend-1)
    A --> F(Backend-n...)
    E --> G(Message Window)
    E --> H(Long Term Memory Settings)
    E --> I(Quota Limit)
    E --> J(Chat Prompt Task)
    E --> K(Greeting Prompt Task)
    E --> L(Moderation Settings)
    J --> M(Model & Parameters)
    J --> N(API Keys)
    J --> O(Prompt Reference)
    J --> P(Tool Specifications)
    O --> Q(Template)
    O --> R(Variables)
    O --> S(Version)
    O --> T(...)
    style K stroke-dasharray: 5, 5
    style L stroke-dasharray: 5, 5
    style P stroke-dasharray: 5, 5

After setting up a unified persona and knowledge base for a character, different backends can be configured. For example, different models may be adopted for different users based on cost considerations.

How to Play

Online Website

You can visit freechat.fun to experience FreeChat. Share your designed AI character!

Running in a Kubernetes Cluster

FreeChat is dedicated to the principles of cloud-native design. If you have a Kubernetes cluster, you can deploy FreeChat to your environment by following these steps:

  1. Put the Kubernetes configuration file in the configs/helm/ directory, named kube-private.conf.

  2. Place the Helm configuration file in the same directory, named values-private.yaml. Make sure to reference the default values.yaml and customize the variables as needed.

  3. Switch to the scripts/ directory. (A condensed command sequence covering these steps is sketched after this list.)

  4. If needed, run install-in.sh to deploy ingress-nginx on the Kubernetes cluster.

  5. If needed, run install-cm.sh to deploy cert-manager on the Kubernetes cluster, which automatically issues certificates for domains specified in ingress.hosts.

  6. Run install-pvc.sh to install PersistentVolumeClaim resources.

    By default, FreeChat stores files on the local file system. You may prefer highly available distributed storage in the cloud. As a cloud-native system, we recommend interfacing through the Kubernetes CSI to avoid adapting individually to each cloud platform's storage products. Most cloud service providers offer Kubernetes storage drivers along with a series of predefined StorageClass resources. Choose the one appropriate for your needs and set it in Helm's global.storageClass option.

    In the future, FreeChat may be refactored to use MinIO's APIs directly, as it is now installed in the Kubernetes cluster as a dependency (serving Milvus).

  7. Run the install.sh script to install FreeChat and its dependencies.

  8. FreeChat aims to provide Open API services. If you like the interactive experience of freechat.fun, run install-web.sh to deploy the front-end application.

  9. Run restart.sh to restart the service.

  10. If you modified any Helm configuration files, use upgrade.sh to update the corresponding Kubernetes resources.

  11. To remove specific resources, run the uninstall*.sh script corresponding to the resource you want to uninstall.
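
Taken together, a first deployment from a shell might look like the following sketch; it is not a verbatim transcript. The kubeconfig source path is an assumption, the optional scripts depend on what already exists in your cluster, and the scripts are assumed to be executable (otherwise invoke them with sh).

# Steps 1-2: provide the private Kubernetes and Helm configuration files
cp ~/.kube/config configs/helm/kube-private.conf    # the kubeconfig source path is an assumption
vi configs/helm/values-private.yaml                 # customize, referencing the default values.yaml

# Step 3: work from the scripts/ directory
cd scripts/

# Steps 4-5 (optional): ingress-nginx and cert-manager, if the cluster does not already have them
./install-in.sh
./install-cm.sh

# Step 6: PersistentVolumeClaim resources (set global.storageClass in values-private.yaml first)
./install-pvc.sh

# Steps 7-8: FreeChat with its dependencies, and (optionally) the front-end application
./install.sh
./install-web.sh

# Step 9: restart the service
./restart.sh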

As a cloud-native application, the services FreeChat depends on are fetched from the Helm repository and deployed to your cluster.

If you prefer cloud services with SLA (Service Level Agreement) guarantees, simply make the relevant settings in configs/helm/values-private.yaml:

mysql:
  enabled: false
  url: <your mysql url>
  auth:
    rootPassword: <your mysql root password>
    username: <your mysql username>
    password: <your mysql password for the username>

redis:
  enabled: false
  url: <your redis url>
  auth:
    password: <your redis password>


milvus:
  enabled: false
  url: <your milvus url>
  milvus:
    auth:
      token: <your milvus api-key>

With this, FreeChat will not automatically install these services, but rather use the configuration information to connect directly.

If your Kubernetes cluster does not have a standalone monitoring system, you can enable the following switch. This will install Prometheus and Grafana services in the same namespace, dedicated to monitoring the status of the services under the FreeChat application:

prometheus:
  enabled: true
grafana:
  enabled: true

Running Locally

You can also run FreeChat locally. This is currently supported on macOS and Linux (though it has only been tested on macOS). You need to install the Docker toolset and have a network connection that can reach Docker Hub.

Once ready, enter the scripts/ directory and run local-run.sh, which downloads and runs the necessary Docker containers. After a successful startup, you can open http://localhost in a browser to see a locally running freechat.fun. The built-in administrator username and password are "admin:freechat". Use local-run.sh --help to see the options the script supports. Good luck!
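
For reference, a typical local session looks like the following sketch of the commands just described:

cd scripts/
./local-run.sh --help    # list the options the script supports
./local-run.sh           # download and start the required Docker containers
# after a successful startup, open http://localhost and sign in as admin / freechat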

Running in an IDE

To run FreeChat in an IDE, you need to start all dependent services first, but you do not need to run the container for the FreeChat application itself. You can execute the scripts/local-deps.sh script to start services like MySQL, Redis, Milvus, etc., locally. Once done, open and debug freechat-start/src/main/java/fun/freechat/Application.java. Make sure you have set the following VM options for startup:

-Dspring.config.location=classpath:/application.yml,classpath:/application-local.yml \
-DAPP_HOME=local-data/freechat \
-Dspring.profiles.active=local
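
For example, the dependent services can be brought up once with the script mentioned above and left running while you debug from the IDE (a sketch):

# start MySQL, Redis, Milvus, etc. locally before launching Application.java
./scripts/local-deps.sh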

Use SDK

Java

  • Dependency
<dependency>
  <groupId>fun.freechat</groupId>
  <artifactId>freechat-sdk</artifactId>
  <version>${freechat-sdk.version}</version>
</dependency>
  • Example
import fun.freechat.client.ApiClient;
import fun.freechat.client.ApiException;
import fun.freechat.client.Configuration;
import fun.freechat.client.api.AccountApi;
import fun.freechat.client.auth.HttpBearerAuth;
import fun.freechat.client.model.UserDetailsDTO;

public class AccountClientExample {
    public static void main(String[] args) {
        ApiClient defaultClient = Configuration.getDefaultApiClient();
        defaultClient.setBasePath("https://freechat.fun");
        
        // Configure HTTP bearer authorization: bearerAuth
        HttpBearerAuth bearerAuth = (HttpBearerAuth) defaultClient.getAuthentication("bearerAuth");
        bearerAuth.setBearerToken("FREECHAT_TOKEN");

        AccountApi apiInstance = new AccountApi(defaultClient);
        try {
            UserDetailsDTO result = apiInstance.getUserDetails();
            System.out.println(result);
        } catch (ApiException e) {
            e.printStackTrace();
        }
    }
}

Python

  • Installation
pip install freechat-sdk
  • Example
import os

import freechat_sdk
from freechat_sdk.rest import ApiException
from pprint import pprint

# Defining the host is optional and defaults to https://freechat.fun
# See configuration.py for a list of all supported configuration parameters.
configuration = freechat_sdk.Configuration(
    host = "https://freechat.fun",
    # Configure Bearer authorization: bearerAuth
    access_token = os.environ["FREECHAT_TOKEN"],
)

# Enter a context with an instance of the API client
with freechat_sdk.ApiClient(configuration) as api_client:
    # Create an instance of the API class
    api_instance = freechat_sdk.AccountApi(api_client)

    try:
        details = api_instance.get_user_details()
        pprint(details)
    except ApiException as e:
        print("Exception when calling AccountClient->get_user_details: %s\n" % e)

TypeScript

  • Installation
npm install freechat-sdk --save
  • Example

Refer to FreeChatApiContext.tsx

System Dependencies

                          Projects
Application Framework     Spring Boot
LLM Framework             LangChain4j
Model Providers           Ollama, OpenAI, Azure OpenAI, DashScope (Alibaba)
Database Systems          MySQL, Redis, Milvus
Monitoring & Alerting     Kube State Metrics, Prometheus, Promtail, Loki, Grafana
OpenAPI Tools             Springdoc-openapi, OpenAPI Generator, OpenAPI Explorer

Collaboration

Application Integration

The FreeChat system is entirely oriented toward Open APIs. The freechat.fun site is built with its TypeScript SDK and depends on almost no private interfaces. You can use these online interfaces to develop your own applications or sites, tailored to your preferences. The online FreeChat service is currently completely free, and there are no plans to charge for it.

Model Integration

FreeChat aims to explore AI virtual character technology with anthropomorphic characteristics. If you are researching this area and hope FreeChat supports your model, please contact us. We look forward to AI technology helping people create their own "soul mates" in the future.