Commit

Revert "Merge branch 'main' into main"
This reverts commit fa8ae15, reversing
changes made to 3af8ac0.
sp6370 committed Jan 27, 2024
1 parent fa8ae15 commit da3a2b0
Showing 149 changed files with 3,879 additions and 12,167 deletions.
4 changes: 2 additions & 2 deletions .vscode/settings.json
Original file line number Diff line number Diff line change
@@ -25,9 +25,9 @@
"[python]": {
"editor.defaultFormatter": "ms-python.black-formatter",
"editor.formatOnSave": true,
-"editor.rulers": [79]
+"editor.rulers": [150]
},
-"black-formatter.args": ["--line-length=79"],
+"black-formatter.args": ["--line-length=150"],
// example: "--disable=C0114,C0115,C0116"
"pylint.args": []
}
37 changes: 0 additions & 37 deletions CHANGELOG.md
@@ -1,42 +1,5 @@
# Changelog

## (2024-01-23) Python Version 1.1.15, NPM Version 1.1.7

Last PR included in this release: https://github.com/lastmile-ai/aiconfig/pull/995

### Features

- **sdk:** Updated input attachments with `AttachmentDataWithStringValue` type to distinguish the data representation ‘kind’ (`file_uri` or `base64`) ([#929](https://github.com/lastmile-ai/aiconfig/pull/929)). Please note that this can [break existing SDK calls](https://github.com/lastmile-ai/aiconfig/pull/932#discussion_r1456387863) for model parsers that use non-text inputs
- **editor:** Added telemetry data to log editor usage. Users can [opt out of telemetry](https://aiconfig.lastmileai.dev/docs/editor/#telemetry) by setting `allow_usage_data_sharing: False` in the `.aiconfigrc` runtime configuration file ([#869](https://github.com/lastmile-ai/aiconfig/pull/869), [#899](https://github.com/lastmile-ai/aiconfig/pull/899), [#946](https://github.com/lastmile-ai/aiconfig/pull/946))
- **editor:** Added CLI `rage` command so users can submit bug reports ([#870](https://github.com/lastmile-ai/aiconfig/pull/870))
- **editor:** Changed streaming format to be output chunks for the running prompt instead of entire AIConfig ([#896](https://github.com/lastmile-ai/aiconfig/pull/896))
- **editor:** Disabled run button on other prompts if a prompt is currently running ([#907](https://github.com/lastmile-ai/aiconfig/pull/907))
- **editor:** Made callback handler props optional and no-op if not included ([#941](https://github.com/lastmile-ai/aiconfig/pull/941))
- **editor:** Added `mode` prop to customize UI themes on client, as well as match user dark/light mode system preferences ([#950](https://github.com/lastmile-ai/aiconfig/pull/950), [#966](https://github.com/lastmile-ai/aiconfig/pull/966))
- **editor:** Added read-only mode where editing of AIConfig is disabled ([#916](https://github.com/lastmile-ai/aiconfig/pull/916), [#935](https://github.com/lastmile-ai/aiconfig/pull/935), [#936](https://github.com/lastmile-ai/aiconfig/pull/936), [#939](https://github.com/lastmile-ai/aiconfig/pull/939), [#967](https://github.com/lastmile-ai/aiconfig/pull/967), [#961](https://github.com/lastmile-ai/aiconfig/pull/961), [#962](https://github.com/lastmile-ai/aiconfig/pull/962))
- **eval:** Generalized params to take in arbitrary dict instead of list of arguments ([#951](https://github.com/lastmile-ai/aiconfig/pull/951))
- **eval:** Created `@metric` decorator to make defining metrics and adding tests easier by only needing to define the evaluation metric implementation inside the inner function ([#988](https://github.com/lastmile-ai/aiconfig/pull/988))
- **python-sdk:** Refactored `delete_output` to set `outputs` attribute of `Prompt` to `None` rather than an empty list ([#811](https://github.com/lastmile-ai/aiconfig/pull/811))

### Bug Fixes / Tasks

- **editor:** Refactored run prompt server implementation to use `stop_streaming`, `output_chunk`, `aiconfig_chunk`, and `aiconfig` so server can more explicitly pass data to client ([#914](https://github.com/lastmile-ai/aiconfig/pull/914), [#911](https://github.com/lastmile-ai/aiconfig/pull/911))
- **editor:** Split `RUN_PROMPT` event on client into `RUN_PROMPT_START`, `RUN_PROMPT_CANCEL`, `RUN_PROMPT_SUCCESS`, and `RUN_PROMPT_ERROR` ([#925](https://github.com/lastmile-ai/aiconfig/pull/925), [#922](https://github.com/lastmile-ai/aiconfig/pull/922), [#924](https://github.com/lastmile-ai/aiconfig/pull/924))
- **editor:** Rearranged default model ordering to be more user-friendly ([#994](https://github.com/lastmile-ai/aiconfig/pull/994))
- **editor:** Centered the Add Prompt button and fixed styling ([#912](https://github.com/lastmile-ai/aiconfig/pull/912), [#953](https://github.com/lastmile-ai/aiconfig/pull/953))
- **editor:** Fixed an issue where changing the model for a prompt resulted in the model settings being cleared; now they will persist ([#964](https://github.com/lastmile-ai/aiconfig/pull/964))
- **editor:** Cleared outputs when first clicking the run button in order to make it clearer that new outputs are created ([#969](https://github.com/lastmile-ai/aiconfig/pull/969))
- **editor:** Fixed bug to display array objects in model input settings properly ([#902](https://github.com/lastmile-ai/aiconfig/pull/902))
- **python-sdk:** Fixed issue where we were referencing `PIL.Image` as a type instead of a module in the HuggingFace `image_2_text.py` model parser ([#970](https://github.com/lastmile-ai/aiconfig/pull/970))
- **editor:** Connected HuggingFace model parser tasks names to schema input renderers ([#900](https://github.com/lastmile-ai/aiconfig/pull/900))
- **editor:** Fixed `float` model settings schema renderer to `number` ([#989](https://github.com/lastmile-ai/aiconfig/pull/989))

### Documentation

- [new] Added [docs page](https://aiconfig.lastmileai.dev/docs/editor) for AIConfig Editor ([#876](https://github.com/lastmile-ai/aiconfig/pull/876), [#947](https://github.com/lastmile-ai/aiconfig/pull/947))
- [updated] Renamed “variables” to “parameters” to make it less confusing ([#968](https://github.com/lastmile-ai/aiconfig/pull/968))
- [updated] Updated Getting Started page with quickstart section, and more detailed instructions for adding API keys ([#956](https://github.com/lastmile-ai/aiconfig/pull/956), [#895](https://github.com/lastmile-ai/aiconfig/pull/895))

## (2024-01-11) Python Version 1.1.12, NPM Version 1.1.5

We built an AIConfig Editor which is like VSCode + Jupyter notebooks for AIConfig files! You can edit the config prompts, parameters, settings, and most importantly, run them to generate outputs. Source control your AIConfig files by clearing outputs and saving. It’s the most convenient way to work with Generative AI models through a local user interface. See the [README](https://github.com/lastmile-ai/aiconfig/tree/v1.1.8/python/src/aiconfig/editor#readme) to learn more about how to use it!
8 changes: 1 addition & 7 deletions README.md
@@ -36,13 +36,7 @@ It allows you to store and iterate on generative AI behavior _separately from yo

**[More context here](#why-is-this-important).**

-## Quickstart
-
-1. `pip3 install python-aiconfig`
-2. `export OPENAI_API_KEY='your-key'`
-3. `aiconfig edit`
-
-## Getting Started Tutorial
+## Getting Started

Check out the full [Getting Started tutorial](https://aiconfig.lastmileai.dev/docs/getting-started/).

23 changes: 7 additions & 16 deletions aiconfig-docs/docs/editor.md
@@ -19,7 +19,6 @@ This guide covers the core features of AIConfig Editor and demonstrates how to:
- [Chain prompts](#chain-prompts)
- [Create prompt templates](#prompt-templates)
- [Add custom model parsers](#custom-model-parsers)
- [Telemetry](#telemetry)
- [FAQ](#faq)

Want to get started quickly? Check out our [Getting Started Tutorial](./getting-started).
@@ -119,7 +118,7 @@ Each cell in AIConfig Editor is used to prompt generative AI models and output r
| **Prompt Name** | The name of the prompt cell which can be referenced in other cells for chaining. |
| **Model** | The model you are prompting in this cell. Use the dropdown to see the available default models to AIConfig Editor. |
| **Settings** | The settings and parameters specific to the model (e.g. system prompt, temperature). These settings will vary depending on the model selected. |
-| **Local Parameters** | These are parameters (variables) that you set to be used in the prompt via handlebars syntax. Local parameters are local to the cell and cannot be accessed in other cells. |
+| **Local Variables (Parameters)** | These are variables that you set to be used in the prompt via handlebars syntax. Local variables are local to the cell and cannot be accessed in other cells. |

Click ▶️ at the right of the cell to execute the prompt and see the model response.

@@ -135,15 +134,15 @@ You can chain your prompts via the cell reference names and handlebars syntax. F

## Create Prompt Templates {#prompt-templates}

-Prompt templates allow you to scale your prompts to different data inputs without needing to constantly modify the prompt itself. To do this in AIConfig Editor, parameters are used to pass in data to prompts. You can set both global and local parameters. Global Parameters can be used across all prompts defined in the editor whereas Local Parameters can only be used in the prompt cell they are defined for.
+Prompt templates allow you to scale your prompts to different data inputs without needing to constantly modify the prompt itself. To do this in AIConfig Editor, variables are used to pass in data to prompts. You can set both global variables and local variables. Global Variables can be used across all prompts defined in the editor whereas Local Variables can only be used in the prompt cell they are defined for.

-**Global Parameters**
-You can set global parameters to be used across all cells in the editor. Click on `Global Parameters` at the top of the editor to expand the form to enter your global parameters.
+**Global Variables**
+You can set global variables to be used across all cells in the editor. Click on `Global Variables` at the top of the editor to expand the form to enter your global variables.

![image5](https://github.com/lastmile-ai/aiconfig/assets/129882602/9633b389-a9ae-4bbd-b9bd-5c965dbbdcaf)

-**Local Parameters**
-You can set local parameters to be used in specific cells in the editor. In the cell, expand the right pane and select `Local Parameters`.
+**Local Variables**
+You can set local variables to be used in specific cells in the editor. In the cell, expand the right pane and select `Local Variables (Parameters)`.

:::note
Local parameters will override the global parameters if they have the same name.
@@ -152,7 +151,7 @@ Local parameters will override the global parameters if they have the same name.
![image6](https://github.com/lastmile-ai/aiconfig/assets/129882602/3c4408e4-be34-4b13-bddc-2dff5df88bcd)
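The precedence rule in the note above can be sketched as a simple merge. This helper is illustrative only — it is not the editor's actual resolution code:

```python
def resolve_variables(global_vars: dict, local_vars: dict) -> dict:
    # Start from the global variables, then let local values
    # shadow any same-named keys (locals win on conflict).
    merged = dict(global_vars)
    merged.update(local_vars)
    return merged

print(resolve_variables({"language": "Python", "tone": "formal"},
                        {"language": "Rust"}))
# {'language': 'Rust', 'tone': 'formal'}
```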

**Creating Prompt Templates**
-Prompt templates are created using [handlebars syntax](https://handlebarsjs.com/guide/) for the parameters. Here is an example where `{{language}}` is defined as a global parameter. You can easily change the values of the parameter but keep the prompt template the same.
+Prompt templates are created using [handlebars syntax](https://handlebarsjs.com/guide/) for the variables. Here is an example where `{{language}}` is defined as a global variable. You can easily change the values of the variable but keep the prompt template the same.
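To make the substitution concrete, here is a toy renderer for `{{language}}`-style placeholders — a simplified stand-in for handlebars, not the aiconfig implementation:

```python
import re

def render(template: str, variables: dict) -> str:
    # Replace each {{name}} placeholder with its value from `variables`.
    return re.sub(r"\{\{\s*(\w+)\s*\}\}",
                  lambda m: str(variables[m.group(1)]),
                  template)

template = "Write a 'Hello, World!' program in {{language}}."
print(render(template, {"language": "Python"}))
# Write a 'Hello, World!' program in Python.
```

Changing the value bound to `language` re-targets the prompt without touching the template itself.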

![image4](https://github.com/lastmile-ai/aiconfig/assets/129882602/4333b532-bc04-41c4-adcb-ce1e9c8ef8ea)

@@ -167,14 +166,6 @@ The AIConfig Editor is highly customizable and allows for custom models to be in
- text-summarization
- and much more!

## Telemetry {#telemetry}

AIConfig Editor collects telemetry data, which helps us understand how to improve the product. The telemetry helps us debug issues and prioritize new features.

**Disabling telemetry**

If you don't want to send any telemetry back to us to help the team build a better product, you can set `allow_usage_data_sharing` to `false` in the `$HOME/.aiconfigrc` configuration file.
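For reference, the opt-out described above would look roughly like this in `$HOME/.aiconfigrc` (assuming the file uses YAML-style `key: value` syntax, as the changelog entry suggests):

```yaml
# $HOME/.aiconfigrc — disable usage telemetry
allow_usage_data_sharing: false
```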

## More Resources

Check out these resources on how you can use your AIConfig created from your AIConfig Editor in your application code.
14 changes: 3 additions & 11 deletions aiconfig-docs/docs/getting-started.md
@@ -10,14 +10,6 @@ import constants from '@site/core/tabConstants';

AIConfig is a framework that makes it easy to build generative AI applications for production. It manages generative AI prompts and model parameters as JSON-serializable configs that can be version controlled, evaluated, and opened in a local editor for rapid prototyping. Please read [AIConfig Basics](https://aiconfig.lastmileai.dev/docs/basics/) to learn more.

## Quickstart

1. `pip3 install python-aiconfig`
2. `export OPENAI_API_KEY='your-key'`
3. `aiconfig edit`

## Getting Started Tutorial

**In this tutorial, we will create a customizable NYC travel itinerary using AIConfig.**

## Install
@@ -90,7 +82,7 @@ For this tutorial, you will need to have an OpenAI API key that has access to GP

## Open AIConfig Editor

AIConfig Editor allows you to visually create and edit the prompt chains and model parameters that are stored as AIConfigs. You can also chain prompts and use global and local parameters in your prompts. Learn more about [AIConfig Editor](https://aiconfig.lastmileai.dev/docs/editor).
AIConfig Editor allows you to visually create and edit the prompt chains and model parameters that are stored as AIConfigs. You can also chain prompts and use global and local variables in your prompts. Learn more about [AIConfig Editor](https://aiconfig.lastmileai.dev/docs/editor).

1. Open your Terminal
2. Run this command: `aiconfig edit --aiconfig-path=travel.aiconfig.json`
@@ -147,13 +139,13 @@ Notice that your AIConfig JSON file updates with the prompt. Your work in AIConf

**4. Create your second prompt `gen_itinerary` which depends on your first prompt.**

-This prompt uses GPT-4 to generate an itinerary based on the output of our first prompt `get_activities` (chaining) and a local variable `order_by`. Local parameters are local to the prompt cell whereas global parameters can be used across prompt cells in the editor. Run the prompt using the Play button.
+This prompt uses GPT-4 to generate an itinerary based on the output of our first prompt `get_activities` (chaining) and a local variable `order_by`. Local variables are local to the prompt cell whereas global variables can be used across prompt cells in the editor. Run the prompt using the Play button.

![img_editor](https://github.com/lastmile-ai/aiconfig/assets/81494782/73558099-b42b-48d2-bac4-3023766da5a0)
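For illustration, the saved `gen_itinerary` entry might look roughly like the sketch below. The field names are an approximation of the aiconfig prompt schema, and `{{get_activities.output}}` is the assumed chaining syntax — the exact shape in your `travel.aiconfig.json` may differ:

```json
{
  "name": "gen_itinerary",
  "input": "Create an itinerary, ordered by {{order_by}}, from these activities: {{get_activities.output}}",
  "metadata": {
    "model": "gpt-4",
    "parameters": { "order_by": "geographic location" }
  }
}
```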

**5. Click the Save button.**

-Notice that your AIConfig JSON file updates with the second prompt, including the chaining logic and parameters. See below:
+Notice that your AIConfig JSON file updates with the second prompt, including the chaining logic and variables. See below:

<details>
<summary>`travel.aiconfig.json`</summary>
12 changes: 3 additions & 9 deletions cookbooks/Basic-Prompt-Routing/assistant_app.py
@@ -32,12 +32,8 @@ async def assistant_response(prompt):

# Streamlit Setup
st.title("AI Teaching Assistant")
-st.markdown(
-    "Ask a math, physics, or general question. Based on your question, an AI math prof, physics prof, or general assistant will respond."
-)
-st.markdown(
-    "**This is a simple demo of prompt routing - based on your question, an LLM decides which AI teacher responds.**"
-)
+st.markdown("Ask a math, physics, or general question. Based on your question, an AI math prof, physics prof, or general assistant will respond.")
+st.markdown("**This is a simple demo of prompt routing - based on your question, an LLM decides which AI teacher responds.**")

# Chat setup
if "messages" not in st.session_state:
@@ -58,6 +54,4 @@ async def assistant_response(prompt):
with st.chat_message("assistant"):
st.markdown(response)

-st.session_state.messages.append(
-    {"role": "assistant", "content": response}
-)
+st.session_state.messages.append({"role": "assistant", "content": response})
11 changes: 2 additions & 9 deletions cookbooks/Basic-Prompt-Routing/create_config.py
@@ -1,17 +1,10 @@
from aiconfig import AIConfigRuntime, Prompt

-aiconfig = AIConfigRuntime.create(
-    "assistant_config", "teaching assistant config"
-)
+aiconfig = AIConfigRuntime.create("assistant_config", "teaching assistant config")

# Set GPT-4 as default model from Teaching Assistant prompts
model_name = "gpt-4"
-model_settings = {
-    "top_k": 40,
-    "top_p": 1,
-    "model": "gpt-4",
-    "temperature": 0.0,
-}
+model_settings = {"top_k": 40, "top_p": 1, "model": "gpt-4", "temperature": 0.0}
aiconfig.add_model(model_name, model_settings)


23 changes: 5 additions & 18 deletions cookbooks/Cli-Mate/cli-mate.py
@@ -53,20 +53,13 @@ async def query(aiconfig_path: str, question: str) -> list[ExecuteResult]:
return result


-async def get_mod_result(
-    aiconfig_path: str, source_code: str, question: str
-) -> list[ExecuteResult]:
+async def get_mod_result(aiconfig_path: str, source_code: str, question: str) -> list[ExecuteResult]:
question_about_code = f"QUERY ABOUT SOURCE CODE:\n{question}\nSOURCE CODE:\n```{source_code}\n```"

return await query(aiconfig_path, question_about_code)


-async def mod_code(
-    aiconfig_path: str,
-    source_code_file: str,
-    question: str,
-    update_file: bool = False,
-):
+async def mod_code(aiconfig_path: str, source_code_file: str, question: str, update_file: bool = False):
# read source code from file
with open(source_code_file, "r", encoding="utf8") as file:
source_code = file.read()
@@ -100,9 +93,7 @@ def signal_handler(_: int, __: FrameType | None):
i = 0
while True:
try:
-user_input = await event_loop.run_in_executor(
-    None, session.prompt, "Query: [ctrl-D to exit] "
-)
+user_input = await event_loop.run_in_executor(None, session.prompt, "Query: [ctrl-D to exit] ")
except KeyboardInterrupt:
continue
except EOFError:
@@ -122,9 +113,7 @@ def signal_handler(_: int, __: FrameType | None):
prompt = user_input

# Dynamically generate the prompt name and prompt object
-new_prompt_name = (
-    f"prompt{len(runtime.prompts)+1}"  # Prompt{number of prompts}
-)
+new_prompt_name = f"prompt{len(runtime.prompts)+1}"  # Prompt{number of prompts}
new_prompt = Prompt(name=new_prompt_name, input=prompt)

# Add the new prompt and run the model
@@ -155,9 +144,7 @@ async def main():
subparsers = parser.add_subparsers(dest="command")

loop_parser = subparsers.add_parser("loop")
-loop_parser.add_argument(
-    "-scf", "--source-code-file", help="Specify a source code file."
-)
+loop_parser.add_argument("-scf", "--source-code-file", help="Specify a source code file.")

args = parser.parse_args()

Empty file removed cookbooks/Getting-Started/log.txt
Empty file.
20 changes: 0 additions & 20 deletions cookbooks/Getting-Started/my_aiconfig.aiconfig.json

This file was deleted.

1 change: 0 additions & 1 deletion cookbooks/Gradio/README.md
@@ -32,7 +32,6 @@ We have started by supporting the 6 most popular Hugging Face tasks (by number o
In addition we support the following **HF inference API**:

- text generation
- text summarization

### Gradio custom component

