
feat(minor): add more flags to the chat command #147

Merged
merged 9 commits into master on Jan 24, 2024

Conversation

@giladgd (Contributor) commented Jan 22, 2024

Description of change

  • feat: add --systemPromptFile flag to the chat command
  • feat: add --promptFile flag to the chat command
  • feat: add --batchSize flag to the chat command

Based on #145 by @stewartoallen, adapted for the beta branch.

Pull-Request Checklist

  • Code is up-to-date with the master branch
  • npm run format to apply eslint formatting
  • npm run test passes with this change
  • This pull request links relevant issues as Fixes #0000
  • There are new or updated unit tests validating the change
  • Documentation has been updated to reflect this change
  • The new commits and pull request title follow conventions explained in pull request guidelines (PRs that do not follow this convention will not be merged)

giladgd and others added 9 commits November 26, 2023 21:29
* feat: evaluate multiple sequences in parallel with automatic batching
* feat: improve automatic chat wrapper resolution
* feat: smart context shifting
* feat: improve TS types
* refactor: improve API
* build: support beta releases
* build: improve dev configurations

BREAKING CHANGE: completely new API (docs will be updated before a stable version is released)
* feat: function calling support
* feat: stateless `LlamaChat`
* feat: improve chat wrapper
* feat: `LlamaText` util
* test: add basic model-dependent tests
* fix: threads parameter
* fix: disable Metal for `x64` arch by default
# Conflicts:
#	llama/addon.cpp
#	src/llamaEvaluator/LlamaContext.ts
#	src/llamaEvaluator/LlamaModel.ts
#	src/utils/getBin.ts
* feat: get embedding for text
* feat(minor): improve `resolveChatWrapperBasedOnModel` logic
* style: improve GitHub release notes formatting
# Conflicts:
#	llama/addon.cpp
#	src/cli/commands/ChatCommand.ts
#	src/llamaEvaluator/LlamaContext.ts
#	src/utils/getBin.ts
@giladgd giladgd self-assigned this Jan 22, 2024
@giladgd giladgd enabled auto-merge (squash) January 22, 2024 21:57
@giladgd giladgd disabled auto-merge January 24, 2024 21:33
@giladgd giladgd merged commit 1cae987 into master Jan 24, 2024
10 of 11 checks passed
@giladgd giladgd deleted the gilad/chatCommandPromptFile branch January 24, 2024 21:34
@giladgd giladgd restored the gilad/chatCommandPromptFile branch January 24, 2024 21:35
@giladgd (Contributor, Author) commented Jan 24, 2024

Merged into master by mistake; reverting.

@giladgd giladgd deleted the gilad/chatCommandPromptFile branch January 24, 2024 21:51
@github-actions bot commented Sep 24, 2024

🎉 This PR is included in version 3.0.0 🎉

The release is available on:

Your semantic-release bot 📦🚀
