
Convert dependencies to uv. Add API/MCP server #129


Merged: 4 commits from feature/bump-version into main, Apr 27, 2025

Conversation

@snexus snexus (Owner) commented Apr 27, 2025

  • Update dependencies and convert the project to uv-based package management; as a result, requirements.txt was deleted.
  • Remove support for llama-cpp: the compatible version became obsolete, and llama.cpp can instead run as a standalone server reached over an OpenAI-compatible connection.
  • Introduce an MCP server for semantic search and RAG answer operations, enabling compatibility with any MCP client.
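MCP clients speak JSON-RPC 2.0, so the new server's semantic-search capability would be invoked with a `tools/call` request. A minimal sketch of what such a message looks like; the tool name and argument names here are illustrative assumptions, not the PR's actual schema:

```python
import json

def make_tool_call(request_id: int, tool: str, arguments: dict) -> str:
    """Build an MCP 'tools/call' request (JSON-RPC 2.0) as a JSON string.

    The tool and argument names passed by the caller are hypothetical;
    the real tool schema is defined by the server in this PR.
    """
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    })

# Hypothetical invocation of a semantic-search tool:
msg = make_tool_call(1, "semantic_search", {"query": "vector databases", "top_k": 5})
```

Any MCP client that can emit messages of this shape can drive the server, which is what makes the feature client-agnostic.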

@snexus snexus requested a review from Copilot April 27, 2025 10:04
@Copilot Copilot AI left a comment

Pull Request Overview

This PR updates package dependencies to uv‑based management, removes support for llama‑cpp, and introduces an MCP server for semantic search and RAG answer operations.

  • Dependency upgrades and configuration updates
  • Removal of llama‑cpp support and introduction of MCP server
  • Update of configuration and cache handling throughout the codebase

Reviewed Changes

Copilot reviewed 13 out of 19 changed files in this pull request and generated no comments.

Summary per file:
  • src/llmsearch/webapp.py: Updated imports, renamed configuration variables, and added a workaround for torch classes; potential naming and comment typos noted.
  • src/llmsearch/utils.py: Refactored import statements; minor formatting changes.
  • src/llmsearch/process.py: Minor formatting and parameter adjustments.
  • src/llmsearch/models/utils.py: Commented out references to the deprecated LlamaModel.
  • src/llmsearch/config.py: Improved YAML file loading with explicit encoding.
  • src/llmsearch/api.py: Integrated the FastAPI MCP server and updated dependency injections.
  • sample_templates/*.yaml: Updated configuration values for embeddings and model names.
  • pyproject.toml: Adjusted the Python version requirement and dependency definitions.
  • README.md: Updated documentation to reflect new features and enhancements.
Files not reviewed (6)
  • .flake8: Language not supported
  • docker/Dockerfile: Language not supported
  • docker/entrypoint.sh: Language not supported
  • docs/installation.rst: Language not supported
  • docs/usage.rst: Language not supported
  • sample_templates/llm/llamacpp.yaml: Language not supported
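With bundled llama-cpp support removed, llama.cpp is expected to run as a standalone server and be reached over its OpenAI-compatible HTTP API. A stdlib-only sketch of building such a request; the base URL, model name, and prompt are assumptions for illustration, and the request is constructed but not sent:

```python
import json
from urllib import request

# Assumed address of a standalone llama.cpp server exposing the
# OpenAI-compatible /v1/chat/completions endpoint.
BASE_URL = "http://localhost:8080/v1"

def build_chat_request(prompt: str, model: str = "local-model") -> request.Request:
    """Construct (without sending) an OpenAI-style chat completion request."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            # llama.cpp typically ignores the key, but OpenAI clients send one.
            "Authorization": "Bearer not-needed",
        },
        method="POST",
    )

req = build_chat_request("What is RAG?")
```

Because the endpoint follows the OpenAI wire format, any OpenAI-compatible client library can be pointed at it by overriding the base URL, which is why a dedicated llama-cpp integration is no longer needed.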
Comments suppressed due to low confidence (2)

src/llmsearch/webapp.py:88

  • The function name 'udpate_index' appears to be misspelled; consider renaming it to 'update_index' for clarity and consistency.
def udpate_index(doc_config_path: str, model_config_file):

src/llmsearch/webapp.py:190

  • There is a typo in the comment: 'paratemeters' should be corrected to 'parameters'.
# _config and _bundle are under scored so paratemeters aren't hashed
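Both suppressed comments amount to simple spelling fixes. A sketch of the corrected code; the signature is taken from the review quote, while the function body is elided and hypothetical:

```python
# Fix 1: rename the misspelled function from the review quote.
def update_index(doc_config_path: str, model_config_file: str) -> None:
    """Rebuild the document index (body elided in this sketch)."""
    ...

# Fix 2: corrected comment spelling:
# _config and _bundle are underscored so parameters aren't hashed
```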

@snexus snexus merged commit fd3466b into main Apr 27, 2025
2 checks passed
@snexus snexus deleted the feature/bump-version branch April 27, 2025 10:14

3 participants