
feat(summarization): LLM-based code summaries via local OpenAI#28

Closed
lexasub wants to merge 2 commits into main from feat/summarization

Conversation


@lexasub lexasub commented Mar 11, 2026

Description

Add LLM-powered code summarization feature (SummarizerService) that generates structured summaries of code entities (functions, methods, classes) with support for Ollama/vLLM backends.

Related Issue

Fixes #(issue number)

Type of Change

  • New feature (non-breaking change that adds functionality)

Checklist

  • My code follows the code style of this project
  • I have added tests that prove my fix/feature works
  • All new and existing tests passed (pytest tests/ -v)
  • I have updated the documentation accordingly
  • My changes generate no new warnings

Testing

# Unit tests
pytest tests/test_summarizer.py -v

# Quality evaluation
ast-rag evaluate --all

# CLI test
ast-rag summarize <function_name> --format markdown

Key Features

  • SummarizerService — Ollama/vLLM integration (OpenAI-compatible API)
  • Structured output: summary, inputs, outputs, side_effects, calls/called_by, complexity, tags
  • CLI: ast-rag summarize <name> --format {markdown,json}
  • MCP tool: summarize_code for agent workflows
  • Caching: SHA-256 code hash for cache invalidation
  • 17 unit tests covering core functionality and edge cases

Configuration

{
  "summarizer": {
    "llm": {
      "base_url": "http://192.168.2.109:1113/v1",
      "model": "qwen2.5-coder:14b"
    },
    "cache": { "enabled": true }
  }
}
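A minimal sketch of how a summarization request to an OpenAI-compatible backend (Ollama/vLLM) could be built from the config above. The endpoint and model come from the PR's config and the payload follows the standard chat-completions wire format; the prompt, key list, and `build_request` helper are illustrative assumptions, not the actual service code.

```python
def build_request(config: dict, code: str) -> tuple[str, dict]:
    """Return (url, payload) for a POST to the backend's /chat/completions."""
    # Structured-output fields listed in the PR's Key Features.
    summary_keys = ["summary", "inputs", "outputs", "side_effects",
                    "calls", "called_by", "complexity", "tags"]
    llm = config["summarizer"]["llm"]
    url = llm["base_url"].rstrip("/") + "/chat/completions"
    payload = {
        "model": llm["model"],
        "messages": [
            {"role": "system",
             "content": "Summarize the code as JSON with keys: " + ", ".join(summary_keys)},
            {"role": "user", "content": code},
        ],
    }
    return url, payload

config = {"summarizer": {"llm": {"base_url": "http://192.168.2.109:1113/v1",
                                 "model": "qwen2.5-coder:14b"}}}
url, payload = build_request(config, "def add(a, b):\n    return a + b")
print(url)  # http://192.168.2.109:1113/v1/chat/completions
```

Any OpenAI-compatible client (e.g. the `openai` package with `base_url` overridden) can then POST this payload to the local server.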

@github-project-automation github-project-automation bot moved this to Backlog in raged kanban Mar 11, 2026
@lexasub lexasub moved this from Backlog to In progress in raged kanban Mar 11, 2026
@lexasub lexasub marked this pull request as ready for review March 11, 2026 11:58
Your Name and others added 2 commits March 11, 2026 16:24
- Add SummarizerService with Ollama/vLLM support
- Generate structured summaries (inputs, outputs, side_effects)
- Add CLI command: ast-rag summarize
- Add MCP tool: summarize_code
- Cache summaries with SHA-256 invalidation
- Support Markdown and JSON output formats
- Add 17 unit tests

Co-authored-by: Qwen-Coder <qwen-coder@alibabacloud.com>
- Configure summarizer with http://192.168.2.109:1113/v1
- Enable summary cache in .ast_rag_summary_cache.json

Co-authored-by: Qwen-Coder <qwen-coder@alibabacloud.com>
@lexasub lexasub force-pushed the feat/summarization branch from 6ed5ea1 to 5742364 Compare March 11, 2026 12:29
@lexasub lexasub closed this Mar 11, 2026
@github-project-automation github-project-automation bot moved this from In progress to Done in raged kanban Mar 11, 2026
