Prompt Profiles #3954

Closed · 1 task done
Boostrix opened this issue May 7, 2023 · 10 comments · Fixed by #3375
Comments

@Boostrix
Contributor

Boostrix commented May 7, 2023

Duplicates

  • I have searched the existing issues

Summary 💡

Based on previous comments: #3858 (comment)

  • We have roughly 10+ open PRs that are all about augmenting/replacing hard-coded prompts.
  • Many of these cannot be merged due to the lack of testing/benchmarks.
  • However, this causes a ton of potentially useful contributions to end up in the backlog.

Thus, the idea is to add support for so-called "Prompt Profiles": a dedicated directory containing a single profiles.json file that maps profile names to sub-directories, each holding its own prompt configs.

Given that prompt-related PRs are so common, but also so controversial due to the lack of testing data, how about adding support for "prompt profiles": a shared folder with sub-folders containing profiles for different prompts, so that people can tinker with different profiles without stepping on anyone's toes?

This will enable people to experiment with different "prompt profiles" without affecting the stability/usefulness of the whole project. It would also provide an option to benchmark (#1359) different prompt profiles against each other (#1354). So, if/when a prompt is promoted to become the official/main one, we can simply change a "default_profile" property.
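
For illustration only - the profiles.json name comes from this proposal, and the keys and paths below are hypothetical, not an existing format - such a mapping could look roughly like this:

```
{
  "default_profile": "default",
  "profiles": {
    "default": "prompts/default/",
    "concise": "prompts/concise/",
    "observer": "prompts/observer/"
  }
}
```

Promoting a community profile to the official one would then be a one-line change to "default_profile".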

Having this sort of infrastructure in place would also provide a good baseline for regression testing and benchmarking.
It would not directly be the solution proposed in #3858, but it is probably the only way to have our cake and eat it too, while also growing test data.

Ideally we'd support a dedicated sub-folder for "prompt profiles", with the corresponding JSON file mapping directories to each profile. That way we can have our cake and eat it, and still close 10+ PRs in the process, because these would all become custom prompt profiles - without any chance of stepping on anyone's toes. Besides, this would also give us 10+ different ways to easily benchmark/test the system in conjunction with different prompts.

Background:

Examples 🌈

No response

Motivation 🔦

No response

@DGdev91
Contributor

DGdev91 commented May 8, 2023

Like the idea. And I feel that my PR #3375 (which you just mentioned) already pretty much covers it.

I guess we'll need to wait for the re-architecting to settle down before having it merged, but yes, I think it can be a good way to handle it.

@Boostrix
Contributor Author

Boostrix commented May 8, 2023

Not having looked at your commits in detail yet: do you already support a custom directory, or how difficult would it be to support a corresponding structure with sub-directories and a JSON profile telling the system which prompt directory to use?

@DGdev91
Contributor

DGdev91 commented May 8, 2023

Not having looked at your commits in detail yet: do you already support a custom directory, or how difficult would it be to support a corresponding structure with sub-directories and a JSON profile telling the system which prompt directory to use?

For now, it just reads everything from a single yaml file.
It shouldn't be hard to add a multi-directory structure, but... why?
Can't a prompt profile be just a single configuration file? Why the need for different folders?
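
Roughly, the single-file approach means everything lives in one prompt_settings.yaml - here is a simplified sketch with placeholder sections, not necessarily the exact layout from the PR:

```
constraints:
  - 'Example constraint the agent must respect.'
resources:
  - 'Example resource available to the agent.'
performance_evaluations:
  - 'Example self-review instruction.'
```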

@Boostrix
Contributor Author

Boostrix commented May 8, 2023

This way we could organize things right inside the repo (imagine having 10+ different prompt profiles supported) - and if/when a prompt profile spans multiple files, a folder is probably better to separate things?

@DGdev91
Contributor

DGdev91 commented May 8, 2023

The PROMPT_SETTINGS_FILE can be configured to a full path. Without touching my PR, you could have the main prompt_settings.yaml in the project's main folder and any other file anywhere you want, even outside the project.
Yes, we could move that file into a "prompts" folder, but that's not what I was saying.
I don't understand why a single prompt should use more than one file. We can just have a folder where every file is a profile.
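
As a sketch of what that could look like (the folder and file names are just examples, and I'm assuming PROMPT_SETTINGS_FILE gets set via .env like the other settings):

```
prompts/
  default.yaml     # one profile per file
  concise.yaml
  observer.yaml

# .env
PROMPT_SETTINGS_FILE=prompts/concise.yaml
```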

Also, it's perfectly fine if a user makes their own config, or maybe even a plugin, but I don't think the project should include different prompt profiles unless there's a specific need (the "dynamic prompts" feature from #3937 could be one case).

Maybe this could evolve into a plugin which lets the user download different prompts from a repository for different needs.

@DGdev91
Contributor

DGdev91 commented May 8, 2023

Quoting you from #3858:

It was my understanding that prompt configs may span several files - at which point a folder would seem natural (sharing/maintenance etc.) - it's frankly also what I am used to when thinking about a profile for a complex app (Firefox, LibreOffice, etc.).
But maybe you're right and a single yaml will suffice?

i18n of a use-case-specific prompt file set would spring to mind?

Also, the dynamic prompting idea is all about adapting prompts as needed, which might involve switching within an active profile?

I can't think of any use case where the same prompt should have different files, at least right now.
i18n isn't a good example. It could be a good idea to write a prompt in a different language, especially if in the future AutoGPT will handle different LLMs (working on it in #2594), but it would give different results than writing it in English. It should be treated as a different configuration.

Also, switching profiles on the fly isn't needed right now, because those prompts are only used in the first LLM call.

@Boostrix
Contributor Author

Boostrix commented May 8, 2023

The PROMPT_SETTINGS_FILE can be configured to a full path. Without touching my PR, you could have the main prompt_settings.yaml in the project's main folder and any other file anywhere you want, even outside the project.

Thanks for clarifying, indeed that sounds already very useful "as is".

I don't understand why a single prompt should use more than one file. We can just have a folder where every file is a profile.

I guess we will see how things unfold - whether I am trying to over-engineer a solution or you are not seeing all the use cases yet :-)
The thing is, being able to spread prompts over several files may make it easier to quickly tell which file needs to be adapted for a certain part of the prompt. We are already seeing a growing number of people using additional agent instances to add observer/monitoring behavior, using completely different prompt sets - in other words, the default/standard prompt is an actual problem for these use-case-specific prompts (will edit this to add the corresponding PRs).

Maybe this could evolve into a plugin which lets the user download different prompts from a repository for different needs.

I am not sure if something as essential as a "prompt manager" should be a separate plugin - like you said, it would also be one of the lowest-level building blocks for supporting other LLM backends. Being able to neatly structure those prompts according to the type of LLM, using folders, would seem natural to me - but we will see.
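
Just to make the folder idea a bit more concrete - a purely hypothetical layout, nothing that exists today:

```
prompts/
  gpt-4/
    default.yaml
    observer.yaml
  llama/
    default.yaml
```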

I can't think of any use case where the same prompt should have different files, at least right now.

We already have different prompt sets that people say perform "better" than the default prompt. These just aren't being accepted/reviewed currently due to a lack of benchmarks/testing. With your work (prompt profiles), we could easily test different prompts and compare them using the same underlying ai_settings.yaml files to see which prompt profiles perform better/worse than others.

switching profiles on the fly isn't needed right now, because those prompts are only used in the first LLM call

See #dynamic prompting

Maybe we should brainstorm some of the goals/non-goals here and get @merwanehamadi involved, because he's the one actively pushing towards multi-prompt support for CI / regression testing purposes (which I wholeheartedly agree with!).

PS: Just to be clear about it, I am not suggesting to change your PR based on my feedback; we will need to wait for feedback from others, obviously. If I am the only one foreseeing a directory-based prompt profile approach, I am most likely mistaken here :-)

@Boostrix
Contributor Author

Closing this, as it's basically in progress due to PR #3375 - @DGdev91?

@DGdev91
Contributor

DGdev91 commented May 16, 2023

Closing this, as it's basically in progress due to PR #3375 - @DGdev91?

Ok for me, if you feel that my PR covers your idea.
...Or we can discuss it more, but I feel it's better to keep it as simple as possible.
We can always expand it in the future if needed.

@Boostrix
Contributor Author

I think it's covered as is - the only thing where we were not quite on the same page is the sub-folder idea, and we haven't yet heard back from others. So at this point, it's probably more important to get some portion of this added, rather than any particular hypothetical version. It can still be revisited/reopened if the need arises.
