
Rework --llm-config CLI arg #2957

Merged · 1 commit into main on Jul 16, 2024

Conversation

@li-boxuan (Collaborator) commented Jul 16, 2024

What is the problem that this fixes or functionality that this introduces? Does it fix any open issues?

After #2756, the semantics of the --llm-config CLI arg became ambiguous: what exactly does it mean to provide an LLM config group via CLI, especially when agent delegation is involved?


Give a summary of what the PR does, explaining any non-trivial design decisions

This PR makes it clear that --llm-config overrides the default LLM config (the [llm] section in config.toml) with a given config group. For example, say you have

[llm.eval_gpt4_1106_preview_llm]
model = "gpt-4-1106-preview"
api_key = "XXX"
temperature = 0.0

[llm.eval_some_openai_compatible_model_llm]
model = "openai/MODEL_NAME"
base_url = "https://OPENAI_COMPATIBLE_URL/v1"
api_key = "XXX"
temperature = 0.0

Then you can pass --llm-config eval_gpt4_1106_preview_llm or --llm-config eval_some_openai_compatible_model_llm to control the default LLM config used for the entire lifecycle of main.py.

What if I also have the following in my config.toml?

[agent.BrowsingAgent]
llm_config="gpt-4o"

Then BrowsingAgent would use llm.gpt-4o, while all other agents would use the config group given by --llm-config.

@li-boxuan (Collaborator, Author) commented Jul 16, 2024

Testing in progress...

Done. Working as expected.

@xingyaoww (Collaborator) left a comment

Thanks! LGTM!

else:
    llm = LLM(llm_config=config.get_llm_config_from_agent(args.agent_cls))
config.set_llm_config(llm_config)
llm = LLM(llm_config=config.get_llm_config_from_agent(args.agent_cls))
@xingyaoww commented Jul 16, 2024

Does it mean that, if I still specify the model config for a particular agent (e.g., browsing) while also overriding the default config via --llm-config, it will use the overridden default config (--llm-config) for the other agents, and the agent-specific config from config.toml for the browsing agent?

@li-boxuan (Collaborator, Author) replied

Yep

@xingyaoww xingyaoww enabled auto-merge (squash) July 16, 2024 03:55
@xingyaoww xingyaoww merged commit e3e437f into main Jul 16, 2024
@xingyaoww xingyaoww deleted the config/rework-llm-config-cli-arg branch July 16, 2024 04:18