Switch repo to src/ layout (#27)
* Move all the Python code to src/

* Convert Dockerfiles to use src/ and pyproject.toml

* Various bug fixes

* Update README
JoshuaC215 authored Sep 2, 2024
1 parent f06432e commit 0c5d006
Showing 23 changed files with 57 additions and 47 deletions.
43 changes: 24 additions & 19 deletions README.md
@@ -20,12 +20,16 @@ This project offers a template for you to easily build and run your own agents u
Run directly in python

```sh
# An OPENAI_API_KEY is required
echo 'OPENAI_API_KEY=your_openai_api_key' >> .env
-pip install -r requirements.txt
-python run_service.py

+# uv is recommended but pip also works
+pip install uv
+uv pip install -r pyproject.toml
+python src/run_service.py

# In another shell
-streamlit run streamlit_app.py
+streamlit run src/streamlit_app.py
```

Run with docker
@@ -54,12 +58,12 @@ docker compose watch

The repository is structured as follows:

-- `agent/research_assistant.py`: Defines the LangGraph agent
-- `agent/llama_guard.py`: Defines the LlamaGuard content moderation
-- `schema/schema.py`: Defines the service schema
-- `service/service.py`: FastAPI service to serve the agent
-- `client/client.py`: Client to interact with the agent service
-- `streamlit_app.py`: Streamlit app providing a chat interface
+- `src/agent/research_assistant.py`: Defines the LangGraph agent
+- `src/agent/llama_guard.py`: Defines the LlamaGuard content moderation
+- `src/schema/schema.py`: Defines the service schema
+- `src/service/service.py`: FastAPI service to serve the agent
+- `src/client/client.py`: Client to interact with the agent service
+- `src/streamlit_app.py`: Streamlit app providing a chat interface
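A consequence of the `src/` move worth noting: the package names themselves stay flat (`agent`, `schema`, `service`, `client`), because the Dockerfiles copy the packages out of `src/` and the run scripts live inside `src/`. A minimal sketch, assuming a hypothetical script run from the repo root that wants those flat imports to resolve:

```python
import sys
from pathlib import Path

# Hypothetical helper for a script at the repo root: put src/ on the
# import path so the packages keep their flat names (agent, schema, ...).
repo_root = Path.cwd()  # assumes the script is run from the repo root
sys.path.insert(0, str(repo_root / "src"))

# After this, an import such as `from client import AgentClient`
# resolves exactly as it did before the src/ move.
```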

## Why LangGraph?

@@ -121,7 +125,7 @@ For local development, we recommend using [docker compose watch](https://docs.do

3. The services will now automatically update when you make changes to your code:
- Changes in the relevant Python files and directories will trigger updates for the relevant services.
-- NOTE: If you make changes to the `requirements.txt` file, you will need to rebuild the services by running `docker compose up --build`.
+- NOTE: If you make changes to the `pyproject.toml` file, you will need to rebuild the services by running `docker compose up --build`.

4. Access the Streamlit app by navigating to `http://localhost:8501` in your web browser.

@@ -137,19 +141,20 @@ You can also run the agent service and the Streamlit app locally without Docker,

1. Create a virtual environment and install dependencies:
```
-python -m venv venv
-source venv/bin/activate
-pip install -r requirements.txt
+pip install uv
+uv venv
+source .venv/bin/activate
+uv pip install -r pyproject.toml
```

2. Run the FastAPI server:
```
-python run_service.py
+python src/run_service.py
```

3. In a separate terminal, run the Streamlit app:
```
-streamlit run streamlit_app.py
+streamlit run src/streamlit_app.py
```

4. Open your browser and navigate to the URL provided by Streamlit (usually `http://localhost:8501`).
@@ -182,14 +187,14 @@ Currently the tests need to be run using the local development without Docker se

To customize the agent for your own use case:

-1. Modify the `agent/research_assistant.py` file to change the agent's behavior and tools. Or, build a new agent from scratch.
-2. Adjust the Streamlit interface in `streamlit_app.py` to match your agent's capabilities.
+1. Modify the `src/agent/research_assistant.py` file to change the agent's behavior and tools. Or, build a new agent from scratch.
+2. Adjust the Streamlit interface in `src/streamlit_app.py` to match your agent's capabilities.

## Building other apps on the AgentClient

-The repo includes a generic `client.AgentClient` that can be used to interact with the agent service. This client is designed to be flexible and can be used to build other apps on top of the agent. It supports both synchronous and asynchronous invocations, and streaming and non-streaming requests.
+The repo includes a generic `src/client/client.AgentClient` that can be used to interact with the agent service. This client is designed to be flexible and can be used to build other apps on top of the agent. It supports both synchronous and asynchronous invocations, and streaming and non-streaming requests.

-See the `run_client.py` file for full examples of how to use the `AgentClient`. A quick example:
+See the `src/run_client.py` file for full examples of how to use the `AgentClient`. A quick example:

```python
from client import AgentClient
```
10 changes: 5 additions & 5 deletions docker/Dockerfile.app
@@ -2,12 +2,12 @@ FROM python:3.12.3-slim

WORKDIR /app

-COPY requirements.txt .
+COPY pyproject.toml .
RUN pip install --no-cache-dir uv
-RUN uv pip install --system --no-cache -r requirements.txt
+RUN uv pip install --system --no-cache -r pyproject.toml

-COPY client/ ./client/
-COPY schema/ ./schema/
-COPY streamlit_app.py .
+COPY src/client/ ./client/
+COPY src/schema/ ./schema/
+COPY src/streamlit_app.py .

CMD ["streamlit", "run", "streamlit_app.py"]
12 changes: 6 additions & 6 deletions docker/Dockerfile.service
@@ -2,13 +2,13 @@ FROM python:3.12.3-slim

WORKDIR /app

-COPY requirements.txt .
+COPY pyproject.toml .
RUN pip install --no-cache-dir uv
-RUN uv pip install --system --no-cache -r requirements.txt
+RUN uv pip install --system --no-cache -r pyproject.toml

-COPY agent/ ./agent/
-COPY schema/ ./schema/
-COPY service/ ./service/
-COPY run_service.py .
+COPY src/agent/ ./agent/
+COPY src/schema/ ./schema/
+COPY src/service/ ./service/
+COPY src/run_service.py .

CMD ["python", "run_service.py"]
3 changes: 2 additions & 1 deletion pyproject.toml
@@ -13,7 +13,7 @@ classifiers = [
"Programming Language :: Python :: 3.12",
]

-requires-python = ">=3.9, <= 3.12.3"
+requires-python = ">=3.9, <=3.12.3"

# NOTE: FastAPI < 0.100.0 and Pydantic v1 is required until langchain has full pydantic v2 compatibility
# https://python.langchain.com/v0.1/docs/guides/development/pydantic_compatibility/
@@ -36,6 +36,7 @@ dependencies = [
"pydantic ~=1.10.17",
"pyowm ~=3.3.0",
"python-dotenv ~=1.0.1",
+"setuptools ~=74.0.0",
"streamlit ~=1.37.0",
"uvicorn ~=0.30.5",
]
1 change: 1 addition & 0 deletions requirements.txt
@@ -19,5 +19,6 @@ numexpr~=2.10.1
pydantic~=1.10.17
pyowm~=3.3.0
python-dotenv~=1.0.1
+setuptools~=74.0.0
streamlit~=1.37.0
uvicorn~=0.30.5
7 changes: 0 additions & 7 deletions run_service.py

This file was deleted.

7 files renamed without changes.
19 changes: 10 additions & 9 deletions client/client.py → src/client/client.py
@@ -212,12 +212,13 @@ async def acreate_feedback(
See: https://api.smith.langchain.com/redoc#tag/feedback/operation/create_feedback_api_v1_feedback_post
"""
request = Feedback(run_id=run_id, key=key, score=score, kwargs=kwargs)
-response = await self.async_client.post(
-    f"{self.base_url}/feedback",
-    json=request.dict(),
-    headers=self._headers,
-    timeout=self.timeout,
-)
-if response.status_code != 200:
-    raise Exception(f"Error: {response.status_code} - {response.text}")
-response.json()
+async with httpx.AsyncClient() as client:
+    response = await client.post(
+        f"{self.base_url}/feedback",
+        json=request.dict(),
+        headers=self._headers,
+        timeout=self.timeout,
+    )
+    if response.status_code != 200:
+        raise Exception(f"Error: {response.status_code} - {response.text}")
+    response.json()
File renamed without changes.
9 changes: 9 additions & 0 deletions src/run_service.py
@@ -0,0 +1,9 @@
+from dotenv import load_dotenv
+import uvicorn
+
+load_dotenv()
+
+if __name__ == "__main__":
+    from service import app
+
+    uvicorn.run(app, host="0.0.0.0", port=80)
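The new entry point calls `load_dotenv()` before `from service import app`, so environment variables from `.env` are in place by the time the service module reads them at import time. A minimal standalone sketch of why that ordering matters (all names here are illustrative stand-ins, not from the repo):

```python
import os

# Stand-in for load_dotenv(): populate the environment first.
os.environ["EXAMPLE_API_KEY"] = "dummy-key"


def make_app() -> dict:
    # Stand-in for `from service import app`: code that reads
    # configuration from the environment when the module loads.
    return {"api_key": os.environ["EXAMPLE_API_KEY"]}


if __name__ == "__main__":
    app = make_app()  # deferred until after the environment is populated
    print(app["api_key"])  # dummy-key
```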
7 files renamed without changes.
