feat(summarization): LLM-based code summaries via local OpenAI #28
Closed
- Add SummarizerService with Ollama/vLLM support
- Generate structured summaries (inputs, outputs, side_effects)
- Add CLI command: ast-rag summarize
- Add MCP tool: summarize_code
- Cache summaries with SHA-256 invalidation
- Support Markdown and JSON output formats
- Add 17 unit tests

Co-authored-by: Qwen-Coder <qwen-coder@alibabacloud.com>
- Configure summarizer with http://192.168.2.109:1113/v1
- Enable summary cache in .ast_rag_summary_cache.json

Co-authored-by: Qwen-Coder <qwen-coder@alibabacloud.com>
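The SHA-256 cache invalidation mentioned above can be sketched as follows. This is a minimal illustration, not the PR's actual implementation: the helper names (`source_hash`, `get_cached_summary`, `put_summary`) and the cache entry layout are assumptions; only the cache filename and the use of SHA-256 come from the PR.

```python
import hashlib
from pathlib import Path

# Cache file named in the commit message above.
CACHE_PATH = Path(".ast_rag_summary_cache.json")

def source_hash(source: str) -> str:
    """SHA-256 over the entity's source text; any edit to the body changes the hash."""
    return hashlib.sha256(source.encode("utf-8")).hexdigest()

def get_cached_summary(cache: dict, entity_name: str, source: str):
    """Return the cached summary only while the stored hash still matches the source."""
    entry = cache.get(entity_name)
    if entry and entry.get("sha256") == source_hash(source):
        return entry["summary"]
    return None  # missing or stale entry: caller should re-summarize

def put_summary(cache: dict, entity_name: str, source: str, summary: dict) -> None:
    """Store a summary together with the hash of the source it was generated from."""
    cache[entity_name] = {"sha256": source_hash(source), "summary": summary}
```

The point of hashing the source rather than tracking timestamps is that a cache entry survives moves and re-checkouts but is invalidated the moment the code itself changes.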
Description
Add LLM-powered code summarization feature (SummarizerService) that generates structured summaries of code entities (functions, methods, classes) with support for Ollama/vLLM backends.

Related Issue
Fixes #(issue number)
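The structured summary shape the description refers to (inputs, outputs, side_effects) could look like the sketch below. The `CodeSummary` dataclass and its field names beyond those three are hypothetical; the PR only specifies that summaries are structured around inputs, outputs, and side effects and can be emitted as JSON.

```python
from dataclasses import dataclass, field, asdict

@dataclass
class CodeSummary:
    """Hypothetical schema for one summarized code entity."""
    name: str
    kind: str                                      # "function" | "method" | "class"
    purpose: str                                   # one-line natural-language summary
    inputs: list = field(default_factory=list)     # parameter names/descriptions
    outputs: list = field(default_factory=list)    # return values/descriptions
    side_effects: list = field(default_factory=list)  # I/O, globals, mutations

    def to_json_dict(self) -> dict:
        """Plain dict for the JSON output format."""
        return asdict(self)
```

Keeping the schema a flat dataclass makes the JSON output trivial and gives the Markdown formatter a fixed set of sections to render.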
Type of Change
Checklist
- Tests pass (pytest tests/ -v)

Testing
Key Features
- CLI command: ast-rag summarize <name> --format {markdown,json}
- MCP tool: summarize_code for agent workflows

Configuration
{
  "summarizer": {
    "llm": {
      "base_url": "http://192.168.2.109:1113/v1",
      "model": "qwen2.5-coder:14b"
    },
    "cache": {
      "enabled": true
    }
  }
}
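One way this configuration could be wired into an OpenAI-compatible client is sketched below. The `build_client_kwargs` helper and the `"ollama"` placeholder key are assumptions for illustration; only the `base_url` and `model` values come from the config above, and local Ollama/vLLM servers typically accept any API key.

```python
import json

# Config block from the PR description, parsed as the service would load it.
CONFIG = json.loads("""
{
  "summarizer": {
    "llm": {
      "base_url": "http://192.168.2.109:1113/v1",
      "model": "qwen2.5-coder:14b"
    },
    "cache": { "enabled": true }
  }
}
""")

def build_client_kwargs(config: dict) -> dict:
    """Map the summarizer config onto OpenAI-client constructor arguments.

    A local OpenAI-compatible server ignores the API key, so a placeholder
    value is used (assumption, not specified in the PR).
    """
    llm = config["summarizer"]["llm"]
    return {"base_url": llm["base_url"], "api_key": "ollama"}

def model_name(config: dict) -> str:
    """Model identifier passed with each chat-completion request."""
    return config["summarizer"]["llm"]["model"]
```

With these kwargs, the service could construct `openai.OpenAI(**build_client_kwargs(CONFIG))` and pass `model=model_name(CONFIG)` per request, so swapping Ollama for vLLM is purely a config change.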