feat: completion and infill #164

Merged: 7 commits from gilad/completion into beta on Feb 18, 2024

Conversation

@giladgd (Contributor) commented on Feb 17, 2024

Description of change

  • feat: add LlamaCompletion that provides the ability to complete or infill text
  • feat: support configuring more options for getLlama when using "lastBuild" (see the sketch after this list)
  • fix: various bug fixes
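
As a rough illustration of the "lastBuild" item above: getLlama("lastBuild") uses the binaries from the last local build, and this PR describes being able to pass additional options alongside it. A minimal sketch follows; the logLevel option name is an illustrative assumption, not confirmed by this PR.

import {getLlama} from "node-llama-cpp";

// use the binaries produced by the last local build instead of resolving new ones,
// while also passing extra options (the option name below is an assumption for illustration)
const llama = await getLlama("lastBuild", {
    logLevel: "warn"
});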

Infill, also known as fill-in-the-middle (FIM), generates a completion for a given prefix that should connect to a given suffix.
For example, given the prefix input 123 and the suffix input 789, the model is expected to generate 456 so that the final text reads 123456789.

Not every model supports infill, so check that the model supports it before trying to generate one.
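
As a minimal sketch of the 123/789 example above, using the LlamaCompletion API added in this PR (it assumes a completion instance already set up with an infill-capable model, as in the full infill example further down):

// prefix "123" and suffix "789"; an infill-capable model is expected to fill in "456"
const middle = await completion.generateInfillCompletion("123", "789");
console.log("123" + middle + "789"); // expected output: 123456789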

How to generate a completion

import {fileURLToPath} from "url";
import path from "path";
import {getLlama, LlamaModel, LlamaContext, LlamaCompletion} from "node-llama-cpp";

const __dirname = path.dirname(fileURLToPath(import.meta.url));

const llama = await getLlama();
const model = new LlamaModel({
    llama,
    modelPath: path.join(__dirname, "models", "stable-code-3b.Q5_K_M.gguf")
});
const context = new LlamaContext({
    model,
    contextSize: Math.min(4096, model.trainContextSize)
});
const completion = new LlamaCompletion({
    contextSequence: context.getSequence()
});

const input = "const arrayFromOneToTwenty = [1, 2, 3,";
console.log("Input: " + input);

const res = await completion.generateCompletion(input);
console.log("Completion: " + res);

In this example I used the stable-code-3b model (stable-code-3b.Q5_K_M.gguf).

How to generate an infill

import {fileURLToPath} from "url";
import path from "path";
import {getLlama, LlamaModel, LlamaContext, LlamaCompletion, UnsupportedError} from "node-llama-cpp";

const __dirname = path.dirname(fileURLToPath(import.meta.url));

const llama = await getLlama();
const model = new LlamaModel({
    llama,
    modelPath: path.join(__dirname, "models", "stable-code-3b.Q5_K_M.gguf")
});
const context = new LlamaContext({
    model,
    contextSize: Math.min(4096, model.trainContextSize)
});
const completion = new LlamaCompletion({
    contextSequence: context.getSequence()
});

if (!completion.infillSupported)
    throw new UnsupportedError("Infill completions are not supported by this model");

const prefix = "const arrayFromOneToFourteen = [1, 2, 3, ";
const suffix = "10, 11, 12, 13, 14];";
console.log("prefix: " + prefix);
console.log("suffix: " + suffix);

const res = await completion.generateInfillCompletion(prefix, suffix);
console.log("Infill: " + res);

In this example I used the stable-code-3b model (stable-code-3b.Q5_K_M.gguf).

Pull-Request Checklist

  • Code is up-to-date with the master branch
  • npm run format to apply eslint formatting
  • npm run test passes with this change
  • This pull request links relevant issues as Fixes #0000
  • There are new or updated unit tests validating the change
  • Documentation has been updated to reflect this change
  • The new commits and pull request title follow conventions explained in pull request guidelines (PRs that do not follow this convention will not be merged)

@giladgd giladgd self-assigned this Feb 17, 2024
@ido-pluto (Contributor) left a comment

LGTM

@giladgd giladgd merged commit ede69c1 into beta Feb 18, 2024
11 checks passed
@giladgd giladgd deleted the gilad/completion branch February 18, 2024 18:38
@giladgd giladgd mentioned this pull request Feb 18, 2024

🎉 This PR is included in version 3.0.0-beta.11 🎉

The release is available on:

Your semantic-release bot 📦🚀


github-actions bot commented Sep 24, 2024

🎉 This PR is included in version 3.0.0 🎉

The release is available on:

Your semantic-release bot 📦🚀
