This project mocks Azure OpenAI API deployments for testing and development purposes.
- Mock both streaming and non-streaming responses.
- Simulate token-based streaming for testing real-time API behaviors.
- Docker
- Python 3.12 (for local development)
To run this mock API using Docker, you can either pull the pre-built image directly from Docker Hub or build it locally:
Pull and run the Docker container from Docker Hub:
```bash
docker run -p 8000:8000 jkfran/azure-openai-mock-api
```
The API will be available at `http://localhost:8000`.
- Clone the repository:

  ```bash
  git clone https://github.com/jkfran/azure-openai-deployment-mock.git
  cd azure-openai-deployment-mock
  ```

- Build the Docker image:

  ```bash
  docker build -t azure-openai-mock-api .
  ```

- Run the Docker container:

  ```bash
  docker run -p 8000:8000 azure-openai-mock-api
  ```
The API will be available at `http://localhost:8000`. You can interact with it using tools like Postman or curl.
The chat completions endpoint (`POST /openai/deployments/{deployment-name}/chat/completions`) simulates an OpenAI chat completion request. It accepts the following:
- `api-key`: A required header to authenticate requests. You can use `MOCK-AZURE-OPENAI-API-KEY-1234567890` as the valid key.
- `stream`: Set to `true` to enable streaming responses. Default is `false`.
```bash
curl -X POST "http://localhost:8000/openai/deployments/mock-deployment/chat/completions" \
  -H "api-key: MOCK-AZURE-OPENAI-API-KEY-1234567890" \
  -H "Content-Type: application/json" \
  -d '{ "stream": true }'
```
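If you prefer to call the mock from Python, here is a minimal sketch using the `requests` library against the same endpoint. It assumes the mock returns a JSON body for non-streaming calls and emits streaming responses as lines (in the style of OpenAI's chunked/SSE output); adjust the parsing to match what the mock actually sends.

```python
import json
import requests

URL = "http://localhost:8000/openai/deployments/mock-deployment/chat/completions"
HEADERS = {"api-key": "MOCK-AZURE-OPENAI-API-KEY-1234567890"}

# Non-streaming request: expect a single JSON completion payload back.
response = requests.post(URL, headers=HEADERS, json={"stream": False})
response.raise_for_status()
print(json.dumps(response.json(), indent=2))

# Streaming request: read the response incrementally as the mock emits tokens.
with requests.post(URL, headers=HEADERS, json={"stream": True}, stream=True) as stream:
    stream.raise_for_status()
    for line in stream.iter_lines(decode_unicode=True):
        if line:  # skip keep-alive blank lines
            print(line)
```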
To run the API locally without Docker, install the required Python dependencies:
```bash
pip install -r requirements.txt
```
Run the FastAPI server:
```bash
uvicorn mock_api:app --reload --host 0.0.0.0 --port 8000
```
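Because the mock mirrors the Azure OpenAI path layout (`/openai/deployments/{deployment-name}/chat/completions`) and authenticates via the `api-key` header, you may also be able to point the official `openai` Python SDK at it for local testing. The sketch below is illustrative only: it assumes the mock tolerates the `api-version` query parameter the SDK appends and returns a response shape the SDK can deserialize.

```python
from openai import AzureOpenAI  # pip install openai

# Point the Azure client at the locally running mock instead of a real deployment.
# Assumptions: the mock accepts (or ignores) the api-version query parameter and
# returns chat-completion payloads compatible with the SDK's response models.
client = AzureOpenAI(
    azure_endpoint="http://localhost:8000",
    api_key="MOCK-AZURE-OPENAI-API-KEY-1234567890",
    api_version="2024-02-01",
)

# "model" maps to the deployment-name segment of the URL, e.g. mock-deployment.
completion = client.chat.completions.create(
    model="mock-deployment",
    messages=[{"role": "user", "content": "Hello, mock!"}],
)
print(completion.choices[0].message.content)
```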
Contributions are welcome! If you'd like to improve the mock API or add more features, feel free to fork the repository, make your changes, and open a pull request.
- Fork the project.
- Create a feature branch.
- Submit a pull request.
This project is open source and available under the MIT License.