Merge pull request instructlab#1575 from leseb/fix-log-ctx-gpu
fix: serve log message with gpu layer and context size values
mergify[bot] authored Jul 3, 2024
2 parents f2c977e + 34312b4 commit efbd310
Showing 1 changed file with 4 additions and 0 deletions.
src/instructlab/model/serve.py (4 additions, 0 deletions)

@@ -93,6 +93,10 @@ def serve(
     # Redirect server stdout and stderr to the logger
     log.stdout_stderr_to_logger(logger, log_file)
 
+    if gpu_layers is None:
+        gpu_layers = ctx.obj.config.serve.llama_cpp.gpu_layers
+    if max_ctx_size is None:
+        max_ctx_size = ctx.obj.config.serve.llama_cpp.max_ctx_size
     logger.info(
         f"Using model '{model_path}' with {gpu_layers} gpu-layers and {max_ctx_size} max context size."
     )
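
For context, the fix substitutes the llama_cpp defaults from the loaded config whenever gpu_layers or max_ctx_size arrive as None, so the startup log reports the values the server will actually use rather than None. Below is a minimal, self-contained sketch of that fallback-and-log pattern; the config class, default values, and function name are hypothetical stand-ins, not InstructLab's API (the real code reads ctx.obj.config.serve.llama_cpp, as in the diff above).

```python
# Illustrative sketch only: hypothetical config class, defaults, and function
# name. The actual change reads ctx.obj.config.serve.llama_cpp in serve().
import logging
from dataclasses import dataclass

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)


@dataclass
class LlamaCppConfig:
    gpu_layers: int = -1       # hypothetical default
    max_ctx_size: int = 4096   # hypothetical default


def resolve_serve_settings(config: LlamaCppConfig, model_path: str,
                           gpu_layers=None, max_ctx_size=None):
    # Fall back to the configured defaults when the caller passed None,
    # so the log line below reports the values that will actually be used.
    if gpu_layers is None:
        gpu_layers = config.gpu_layers
    if max_ctx_size is None:
        max_ctx_size = config.max_ctx_size
    logger.info(
        f"Using model '{model_path}' with {gpu_layers} gpu-layers "
        f"and {max_ctx_size} max context size."
    )
    return gpu_layers, max_ctx_size


if __name__ == "__main__":
    # No explicit values passed: the configured defaults are logged, not None.
    resolve_serve_settings(LlamaCppConfig(), "models/example.gguf")
```

Calling the sketch without explicit values logs the configured defaults rather than None, which mirrors the behavior this commit's change produces in serve().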
