Fixed context caching and improved LLM prompt handling and response processing #57

Open · wants to merge 2 commits into main
Conversation

kshitij79
Contributor

Closes #47, closes #48

Changes

  • Updated inline templates to explicitly indicate cursor position and prevent modification of existing code.
  • Enhanced cleanSuggestion to fall back to the original suggestion when the cleaned result is empty.
  • Fixed a bug in API endpoint handling in the Gemini, Mistral, and OpenAI providers.
  • Increased the robustFetch timeout to tolerate longer network delays.
  • Updated cache key generation to include context documents.
  • Improved logging clarity and corrected cache key generation in llmManager.
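
The cleanSuggestion fallback described above can be sketched roughly as follows; the function name comes from the PR, but the body and regexes here are assumptions, not the actual diff:

```typescript
// Hypothetical sketch of the fallback behaviour described in the PR.
// The stripping rules below (removing Markdown code fences and trimming
// whitespace) are assumed for illustration only.
function cleanSuggestion(raw: string): string {
  const cleaned = raw
    .replace(/^```[a-zA-Z]*\n?/, "") // drop an opening code fence
    .replace(/\n?```$/, "")          // drop a closing code fence
    .trim();
  // If cleaning removed everything, keep the original suggestion
  // rather than returning an empty completion.
  return cleaned.length > 0 ? cleaned : raw;
}
```

The point of the change is that an over-aggressive cleanup pass should never turn a non-empty model response into an empty suggestion.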
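
The cache-key change can likewise be sketched as hashing the context documents alongside the prompt, so an edit to any context file invalidates the cached completion. The function and parameter names here are hypothetical; only the idea (include context documents in the key) comes from the PR:

```typescript
import { createHash } from "node:crypto";

// Hypothetical sketch: derive the cache key from both the prompt and
// every context document, so stale context no longer produces hits.
function makeCacheKey(prompt: string, contextDocs: string[]): string {
  const hash = createHash("sha256");
  hash.update(prompt);
  for (const doc of contextDocs) {
    // NUL separator avoids ambiguous concatenation of adjacent inputs.
    hash.update("\u0000");
    hash.update(doc);
  }
  return hash.digest("hex");
}
```

With a key like this, changing any context document yields a different key, which addresses the "cache does not reflect recent changes" symptom.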

Flags

  • Needs testing.

Related Issues

  • #47
  • #48

Author Checklist

  • Ensure you provide a DCO sign-off for your commits using the --signoff option of git commit.
  • Vital features and changes captured in unit and/or integration tests
  • Commit messages follow the AP format
  • Extend the documentation, if necessary
  • Merging to master from fork:branchname

Successfully merging this pull request may close these issues:

  • Context File Caching Does Not Reflect Recent Changes