feat: consolidate azure with openai provider #60
Conversation
draft until #54
assert reply_message.content == [Text(text="Hello!")]
assert reply_usage.total_tokens == 35
mock_post.assert_called_once_with(
    f"{azure_provider.client.base_url}/chat/completions?api-version={azure_provider.api_version}",
Hard to see here, but it is the same OpenAI endpoint as normal OpenAI, just rooted at chat/completions under the deployment path. This is why the openai python SDK needs so little config to switch to Azure.
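To make that concrete, here is a minimal sketch contrasting the two clients, assuming the openai python SDK v1.x; the endpoint, deployment, and api-version values are placeholders echoing the cassette below, not configuration taken from this PR.

```python
# Minimal sketch, assuming the openai python SDK v1.x; values are illustrative.
from openai import AzureOpenAI, OpenAI

# Plain OpenAI: host and path are fixed by the SDK.
openai_client = OpenAI(api_key="sk-...")

# Azure: same chat/completions surface, just a different base URL shape
# (https://<resource>.openai.azure.com/openai/deployments/<deployment>/...).
azure_client = AzureOpenAI(
    azure_endpoint="https://test.openai.azure.com",
    azure_deployment="test-azure-deployment",
    api_version="2024-05-01-preview",
    api_key="...",
)

# Both clients are then called identically:
# client.chat.completions.create(model=..., messages=[...])
```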
user-agent:
  - python-httpx/0.27.2
method: POST
uri: https://test.openai.azure.com/openai/deployments/test-azure-deployment/chat/completions?api-version=2024-05-01-preview
@michaelneale this is a scrubbed real request/response
@anuraaga this has two commits, but pinging in case there's any interest in the azurification.
Just some nits, LGTM
tests/providers/conftest.py
This scrubs sensitive request data in a provider-specific way. Note that headers
are case-sensitive!
"""
if "openai" in request.uri:
Using the presence of `openai` in the URI to indicate Azure is still blowing my mind, especially since it's not obvious that the uri doesn't contain the host, which could otherwise have been api.openai.com.

Perhaps `request.headers["host"] == os.environ.get("AZURE_CHAT_COMPLETIONS_HOST_NAME")` is an option? Admittedly not great either, just a note to see if there isn't anything to improve here.
This was a goof on my side: I meant to put openai.azure.com, which I think we can keep as a condition. The problem with using that ENV variable is that it is a bit odd; despite the name it isn't a hostname, it's a base URL.
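For reference, a minimal sketch of what the condition could look like, assuming a vcrpy `before_record_request` hook; the header names and placeholder values are illustrative and not the exact scrubbing code in this PR.

```python
# Minimal sketch of a vcrpy before_record_request hook using the
# openai.azure.com condition discussed above; placeholder values are
# illustrative only.
def scrub_sensitive_data(request):
    if "openai.azure.com" in request.uri:
        # Azure OpenAI authenticates with an api-key header.
        request.headers["api-key"] = "test_azure_api_key"
    else:
        request.headers["Authorization"] = "Bearer test_openai_api_key"
    return request
```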
Force-pushed from 19c2c3b to 42e5d99
Currently, at least the Azure deployment I'm using doesn't support vision, even though the model seems the same as OpenAI's. For now I'm not adding the vision tests, but someone can try again in a subsequent PR, perhaps with another deployment.
@anuraaga can you verify I am understanding right, basically that we can't use vision yet? I've tried gpt-4 and no dice, but the above docs make sense, e.g. the tests below fail with invalid_request_error before getting to the assertions:

@pytest.mark.vcr()
def test_azure_vision(default_azure_env):
    reply_message, reply_usage = vision(AzureProvider, AZURE_MODEL)
    assert reply_message.content == [Text(text='The first entry in the menu says "Ask Goose."')]
    assert reply_usage.total_tokens == 14241

@pytest.mark.integration
def test_azure_vision_integration():
    reply = vision(AzureProvider, AZURE_MODEL)
    assert "ask goose" in reply[0].text.lower()
Force-pushed from 42e5d99 to cf13320
    base_url=url,
    timeout=httpx.Timeout(60 * 10),
)
ollama_url = os.environ.get("OLLAMA_HOST", OLLAMA_HOST)
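As a rough sketch of the pattern in the diff above (an env var overriding a module-level default, then a long-timeout httpx client); the default URL and the surrounding code are assumptions for illustration, not the provider's actual module.

```python
# Rough sketch of the env-var fallback pattern above; the default URL and
# surrounding provider code are assumptions for illustration only.
import os

import httpx

OLLAMA_HOST = "http://localhost:11434/"  # assumed module-level default

ollama_url = os.environ.get("OLLAMA_HOST", OLLAMA_HOST)
client = httpx.Client(
    base_url=ollama_url,
    timeout=httpx.Timeout(60 * 10),  # generous 10-minute timeout for slow local models
)
```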
@baxen lemme know if you think this is more sensible than before
yeah this looks good!
Looks great!
Note: in order to be able to run `just integration -k azure` and not also run openai and ollama, I needed to get rid of the openai subdir in the tests (as it was before I started).