v6.0.0
Features
- Remove cap for `max_concurrency` in `LimitedConcurrencyClient`.
- Introduce abstract `LanguageModel` class to integrate with LLMs from any API.
- Every `LanguageModel` supports echo to retrieve log probs for an expected completion given a prompt.
- Introduce abstract `ChatModel` class to integrate with chat models from any API.
- Introduce `Pharia1ChatModel` for usage with pharia-1 models.
- Introduce `Llama3ChatModel` for usage with llama models.
- Upgrade `ArgillaWrapperClient` to use Argilla v2.x.
- (Beta) Add `DataClient` and `StudioDatasetRepository` as connectors to Studio for submitting data.
- Add the optional argument `generate_highlights` to `MultiChunkQa`, `RetrieverBasedQa`, and `SingleChunkQa`. This makes it possible to disable highlighting for performance reasons.
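The abstract model classes above define a shared interface that concrete API integrations implement. The sketch below illustrates the general pattern only: the class names come from these release notes, but the method names, signatures, and the `EchoingStub` implementation are illustrative assumptions, not the library's actual API.

```python
from abc import ABC, abstractmethod


class LanguageModel(ABC):
    """Abstract completion-model interface (signatures are assumptions)."""

    @abstractmethod
    def complete(self, prompt: str) -> str:
        """Return a completion for the given prompt."""

    @abstractmethod
    def echo(self, prompt: str, expected_completion: str) -> list[float]:
        """Return per-token log probs for an expected completion."""


class ChatModel(LanguageModel):
    """Abstract chat extension: adds a message-based interface."""

    @abstractmethod
    def chat(self, messages: list[dict[str, str]]) -> str:
        """Return the assistant reply for a list of role/content messages."""


class EchoingStub(ChatModel):
    """Toy implementation, useful only to show the contract."""

    def complete(self, prompt: str) -> str:
        return prompt.upper()

    def echo(self, prompt: str, expected_completion: str) -> list[float]:
        # A real model would score each token; zeros are placeholders here.
        return [0.0] * len(expected_completion.split())

    def chat(self, messages: list[dict[str, str]]) -> str:
        return self.complete(messages[-1]["content"])


model = EchoingStub()
print(model.chat([{"role": "user", "content": "hello"}]))  # HELLO
print(model.echo("2+2=", "4"))  # [0.0]
```

Because every integration implements the same abstract base, task code can be written once against `LanguageModel` or `ChatModel` and reused with any backing API.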
Fixes
- Increase the number of returned `log_probs` in `EloQaEvaluationLogic` to avoid missing a valid answer.
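The fix above concerns how many token alternatives the evaluation logic requests. A minimal illustration of why the count matters (the distribution and helper are hypothetical, not the library's code): when only the single most likely token is returned, a valid answer token that ranks lower is absent from the result and cannot be scored.

```python
# Toy next-token log-prob distribution (hypothetical values).
log_probs = {"A": -0.3, "B": -1.5, "C": -2.0}


def top_log_probs(dist: dict[str, float], n: int) -> dict[str, float]:
    """Keep the n most likely tokens, as an API returning top-n would."""
    ranked = sorted(dist.items(), key=lambda kv: kv[1], reverse=True)
    return dict(ranked[:n])


# With n=1 the valid answer token "B" is missing from the result,
# so any lookup of its probability would fail; with n=3 it is present.
assert "B" not in top_log_probs(log_probs, 1)
assert "B" in top_log_probs(log_probs, 3)
```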
Deprecations
- Removed `DefaultArgillaClient`.
- Deprecated `Llama2InstructModel`.
Breaking Changes
- We needed to upgrade the argilla-server image version from `argilla-server:v1.26.0` to `argilla-server:v1.29.0` to maintain compatibility.
  - Note: We also updated our Elasticsearch Argilla backend to `8.12.2`.
Full Changelog: v5.1.0...v6.0.0