OpenAI Rust Library


The OpenAI Rust library provides convenient access to the OpenAI API from Rust applications.

Installation

[dependencies]
rs_openai = { version = "0.5.0" }
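
The examples in this README also use a few companion crates: tokio for the async runtime behind #[tokio::main], futures for consuming streams, and dotenvy for loading the .env file. A manifest covering them might look like the sketch below (version numbers are illustrative, not requirements of this crate):

[dependencies]
rs_openai = { version = "0.5.0" }
tokio = { version = "1", features = ["full"] }   # async runtime for #[tokio::main]
futures = "0.3"                                  # StreamExt, used in stream mode
dotenvy = "0.15"                                 # loads OPENAI_API_KEY from .env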

Features

  • Audio (including text and JSON response formats)
  • Chat (including SSE streaming)
  • Completions (including SSE streaming)
  • Edits
  • Embeddings
  • Engines (already deprecated)
  • Files
  • Fine-Tunes (including SSE streaming)
  • Images
  • Models
  • Moderations
  • Enhanced backoff
  • Supports Microsoft Azure Endpoints

Usage

The library needs to be configured with your account's secret key, which is available on the website. We recommend setting it as an environment variable.

# .env
OPENAI_API_KEY=sk-...
OPENAI_API_ORGANIZATION=org-...

Here's an example of initializing the library with the API key loaded from an environment variable and creating a chat completion:

use dotenvy::dotenv;
use rs_openai::{
    chat::{ChatCompletionMessageRequestBuilder, CreateChatRequestBuilder, Role},
    OpenAI,
};
use std::env::var;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    dotenv().ok();
    let api_key = var("OPENAI_API_KEY").unwrap();

    let client = OpenAI::new(&OpenAI {
        api_key,
        org_id: None,
    });

    let req = CreateChatRequestBuilder::default()
        .model("gpt-3.5-turbo")
        .messages(vec![ChatCompletionMessageRequestBuilder::default()
            .role(Role::User)
            .content("To Solve LeetCode's problem 81 in Rust.")
            .build()?])
        .build()?;

    let res = client.chat().create(&req).await?;
    println!("{:?}", res);

    Ok(())
}
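
If your account belongs to an organization, the client can also carry the organization ID through the same org_id field shown above. A minimal sketch, assuming you store it in the OPENAI_API_ORGANIZATION variable from the .env example (a missing variable simply becomes None):

use dotenvy::dotenv;
use rs_openai::OpenAI;
use std::env::var;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    dotenv().ok();

    // OPENAI_API_KEY is required; OPENAI_API_ORGANIZATION is optional,
    // so a missing variable simply falls back to None.
    let _client = OpenAI::new(&OpenAI {
        api_key: var("OPENAI_API_KEY")?,
        org_id: var("OPENAI_API_ORGANIZATION").ok(),
    });

    // `_client` can then be used exactly as in the example above.
    Ok(())
}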

Stream

Like ChatGPT, stream mode is supported for Create chat completion, Create completion, and List fine-tune events. In these cases, tokens are sent as data-only server-sent events as they become available, so output can be displayed incrementally. The following code consumes a chat completion stream:

use dotenvy::dotenv;
use futures::StreamExt;
use rs_openai::{
    chat::{ChatCompletionMessageRequestBuilder, CreateChatRequestBuilder, Role},
    OpenAI,
};
use std::env::var;
use std::io::{stdout, Write};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    dotenv().ok();
    let api_key = var("OPENAI_API_KEY").unwrap();

    let client = OpenAI::new(&OpenAI {
        api_key,
        org_id: None,
    });


    // Stream mode: the same request as above, but with .stream(true) set.
    let req = CreateChatRequestBuilder::default()
        .model("gpt-3.5-turbo")
        .messages(vec![ChatCompletionMessageRequestBuilder::default()
            .role(Role::User)
            .content("To Solve LeetCode's problem 81 in Rust.")
            .build()?])
        .stream(true)
        .build()?;

    let mut stream = client.chat().create_with_stream(&req).await?;

    let mut lock = stdout().lock();
    while let Some(response) = stream.next().await {
        response.unwrap().choices.iter().for_each(|choice| {
            if let Some(ref content) = choice.delta.content {
                write!(lock, "{}", content).unwrap();
            }
        });

        stdout().flush()?;
    }

    Ok(())
}
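
The loop above calls unwrap() on each chunk for brevity. A stream item can be an Err (for example, if the connection drops mid-response), so in real code you may prefer to handle that case and stop cleanly. The snippet below is a hedged, drop-in replacement for the while let loop above, using only the fields the example already shows (and Debug formatting for the error, which the original unwrap() already requires):

    // Same stream as above, but without unwrap(): print what arrived,
    // report the error, and stop instead of panicking.
    while let Some(chunk) = stream.next().await {
        match chunk {
            Ok(response) => {
                for choice in &response.choices {
                    if let Some(ref content) = choice.delta.content {
                        write!(lock, "{}", content)?;
                    }
                }
                stdout().flush()?;
            }
            Err(err) => {
                eprintln!("stream error: {:?}", err);
                break;
            }
        }
    }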

Check out the full API documentation for examples of all the available functions.

Requirements

In general, we want to support the versions of Rust that our customers are using. If you run into any version issues, please let us know on our support page.

Contributing

The main purpose of this repository is to continue to evolve OpenAI Rust Library, making it faster and easier to use. Development of OpenAI Rust Library happens in the open on GitHub, and we are grateful to the community for contributing bugfixes and improvements. Read below to learn how you can take part in improving OpenAI Rust Library.

OpenAI Rust Library has adopted a Code of Conduct that we expect project participants to adhere to. Please read the full text so that you can understand what actions will and will not be tolerated.

Read our contributing guide to learn about our development process, how to propose bugfixes and improvements, and how to build and test your changes to OpenAI Rust Library.

Good Issues

Please make sure to read the Issue Reporting Checklist before opening an issue. Issues not conforming to the guidelines may be closed immediately.

Thanks

As a Rust beginner, much of the experience, thinking, and ideas behind this library came from 64bit's async-openai. Thank you for your great project!

License

OpenAI Rust Library is licensed under the terms of the MIT license.
