main <- alpha #1
Conversation
This is looking awesome!!
I'm unable to build due to a number of compiler errors (see nit errors left in this review)
Rust I'm building with locally:
❯ cargo --version
cargo 1.70.0 (ec8a8a0ca 2023-04-25)
❯ rustc --version
rustc 1.70.0 (90c541806 2023-05-31)
Looks like this is also a WIP 😅 so feel free to disregard my comments and ping me when this is ready to look at! Very cool!!
Hey, I'll be updating the build steps, and there are todo!() calls that are breaking the build. I'm working on it right now. Will let you know when it's ready. Thanks for taking a look.
Quick note: we'll want to keep the bar for contributors high, so let's make sure we add a README.md
and some development docs.
Right. Will do.
Hey @jpmcb. The project now has build steps and routes documented. Can you take a look? Once this first draft is merged, we can look into further improving the output by tweaking src/conversation/prompts.rs, moving to GPT-4, adding auth to the service using Supabase, and improving the semantic search.
Awesome, this is on my docket to review today: great work!!
I'm not seeing an attribution or license on model/model_quantized.onnx
in https://huggingface.co/rawsh/multi-qa-MiniLM-distill-onnx-L6-cos-v1
Is this ok to use in our project? Are there any constraints on us deploying it?
Overall, this looks really great. Well done. I'm planning to do another, more in-depth pass this afternoon but wanted to drop this early feedback for you.
https://www.sbert.net/#citing-authors mentions attribution guidance for these models.
I've updated the README.md with the attributions. 02db5c6.
Signed-off-by: John McBride <[email protected]>
Feature: Add MIT license
Overall, looks good. Once existing comments are addressed, will merge 👍🏼
🍕 RepoQuery 🍕
A REST service to answer user-queries about public GitHub repositories
🔎 The Project
RepoQuery is an early-beta project that uses recursive OpenAI function calling paired with semantic search (using all-MiniLM-L6-v2) to index and answer user queries about public GitHub repositories.
Related Tickets & Documents
open-sauced/ai#192
open-sauced/ai#226
📬 Service Endpoints
1. /embed
To generate and store embeddings for a GitHub repository.
Parameters
The parameters are passed as a JSON object in the request body:
- owner (string, required): The owner of the repository.
- name (string, required): The name of the repository.
- branch (string, required): The name of the branch.

Response
The request is processed by the server and responses are sent as Server-Sent Events (SSE). The event stream will contain the following events with optional data.
The events are defined in repo-query/src/routes/events.rs, lines 14 to 20 (at f2f415a).
Example
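The example snippet itself did not survive extraction; as an illustrative sketch (the owner/name/branch values are hypothetical), the request body and a curl invocation against a locally running instance might look like:

```shell
# Illustrative request body for /embed; field names follow the parameter list above.
BODY='{"owner": "open-sauced", "name": "ai", "branch": "beta"}'
echo "$BODY"

# With the service running locally on the default port 3000, the request
# could be sent like so:
#   curl --location 'localhost:3000/embed' \
#     --header 'Content-Type: application/json' \
#     --data "$BODY"
```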
2. /query
To perform a query on the API with a specific question related to a repository.
Parameters
The parameters are passed as a JSON object in the request body:
- query (string, required): The question or query you want to ask.
- repository (object, required): Information about the repository for which you want to get the answer.
  - owner (string, required): The owner of the repository.
  - name (string, required): The name of the repository.
  - branch (string, required): The name of the branch.

Response
The request is processed by the server and responses are sent as Server-Sent Events (SSE). The event stream will contain the following events with optional data.
The events are defined in repo-query/src/routes/events.rs, lines 22 to 29 (at f2f415a).
Example
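The original example was lost in extraction; as a sketch (query text and repository values are hypothetical), the request body would nest owner/name/branch under the repository object, and could be sent with curl:

```shell
# Illustrative request body for /query; note that owner/name/branch are nested
# under the "repository" object, per the parameter list above.
BODY='{"query": "How do I run this project?", "repository": {"owner": "open-sauced", "name": "ai", "branch": "beta"}}'
echo "$BODY"

# With a local instance running on port 3000, this could be sent like so:
#   curl --location 'localhost:3000/query' \
#     --header 'Content-Type: application/json' \
#     --data "$BODY"
```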
3. /collection
To check if a repository has been indexed.
Parameters
The parameters are passed as query parameters:

- owner (string, required): The owner of the repository.
- name (string, required): The name of the repository.
- branch (string, required): The name of the branch.

Response
This endpoint returns an OK status code if the repository has been indexed by the service.

Example
curl --location 'localhost:3000/collection?owner=open-sauced&name=ai&branch=beta'
🧪 Running Locally
To run the project locally, there are a few prerequisites:
Once the above requirements are satisfied, you can run the project like so:
Environment variables
The project requires the following environment variables to be set.
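As a sketch, the required key can be exported in the shell session that runs the service (the value below is a placeholder, not a real credential):

```shell
# Placeholder value; substitute a real OpenAI API key before running the service.
export OPENAI_API_KEY='sk-your-key-here'
echo "$OPENAI_API_KEY"
```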
- OPENAI_API_KEY (required): To authenticate requests to OpenAI.

Database setup
Start Docker and run the following commands to spin up a Docker container with a Qdrant image.
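The original commands were lost in extraction; a typical invocation of the official Qdrant image would map both the REST/dashboard port and the gRPC port:

```shell
# Sketch: run the official Qdrant image from Docker Hub, exposing the
# REST/dashboard port (6333) and the gRPC port (6334) used by the service.
CMD='docker run -p 6333:6333 -p 6334:6334 qdrant/qdrant'
echo "$CMD"
# Run the command above in a terminal with Docker running.
```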
The database dashboard will be accessible at localhost:6333/dashboard; the project communicates with the DB on port 6334.

Running the project
Run the following command to install the dependencies and run the project on port 3000. This command will build and run the project with optimizations enabled (highly recommended).