Understand how your brand shows up in LLM responses.
Updated Mar 25, 2026 - TypeScript
Langfuse MCP server with built-in analytics. 34 tools: traces, observations, sessions, scores, prompts, and datasets, plus accuracy metrics, failure detection, token percentiles, cost breakdowns, latency analysis, and context breach scanning. Works with Claude Code, Cursor, and Codex.
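To illustrate the kind of metric a tool like "token percentiles" computes, here is a minimal sketch in TypeScript. The `Trace` shape and the sample token counts are made up for illustration; they are not the server's actual API.

```typescript
// Illustrative sketch: computing token-count percentiles (e.g. p50/p95)
// over a set of LLM trace observations. The Trace interface and the
// sample values below are hypothetical, not the real tool's schema.
interface Trace {
  id: string;
  totalTokens: number;
}

// Percentile via linear interpolation between closest ranks.
// Expects `sorted` in ascending order.
function percentile(sorted: number[], p: number): number {
  if (sorted.length === 0) throw new Error("empty input");
  const rank = (p / 100) * (sorted.length - 1);
  const lo = Math.floor(rank);
  const hi = Math.ceil(rank);
  const frac = rank - lo;
  return sorted[lo] + (sorted[hi] - sorted[lo]) * frac;
}

function tokenPercentiles(traces: Trace[], ps: number[]): Record<string, number> {
  const sorted = traces.map(t => t.totalTokens).sort((a, b) => a - b);
  return Object.fromEntries(ps.map(p => [`p${p}`, percentile(sorted, p)]));
}

// Example: five hypothetical traces.
const traces: Trace[] = [
  { id: "t1", totalTokens: 120 },
  { id: "t2", totalTokens: 340 },
  { id: "t3", totalTokens: 90 },
  { id: "t4", totalTokens: 560 },
  { id: "t5", totalTokens: 210 },
];
console.log(tokenPercentiles(traces, [50, 95]));
```

Linear interpolation is one common percentile convention; a real analytics backend may use nearest-rank or another estimator.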
Python SDK for AI agent observability, monitoring, and evaluation. Includes tracing for AI agents, LLMs, and tools; debugging for multi-agent systems; self-hosted dashboards; and advanced analytics with timeline and execution-graph views.
Campus ops intelligence platform turning nearly 1.8k survey responses into cited policy insights: ETL, metrics, NLP clustering, pgvector RAG, and a Next.js dashboard.
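The retrieval step behind a pgvector RAG pipeline can be sketched in a few lines: rank stored chunks by cosine similarity to a query embedding. In production this is a SQL query ordering by pgvector's cosine-distance operator (`<=>`); the chunk texts and three-dimensional embeddings below are toy values for illustration.

```typescript
// Minimal sketch of cosine-similarity retrieval, the math behind a
// pgvector `ORDER BY embedding <=> $query` RAG lookup. All data here
// is made up; real embeddings have hundreds of dimensions.
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

interface Chunk { text: string; embedding: number[]; }

// Return the k chunks most similar to the query embedding.
function topK(chunks: Chunk[], query: number[], k: number): Chunk[] {
  return [...chunks]
    .sort((x, y) =>
      cosineSimilarity(y.embedding, query) - cosineSimilarity(x.embedding, query))
    .slice(0, k);
}

const chunks: Chunk[] = [
  { text: "dining hall hours", embedding: [1, 0, 0] },
  { text: "parking policy", embedding: [0, 1, 0] },
  { text: "meal plan costs", embedding: [0.9, 0.1, 0] },
];
console.log(topK(chunks, [1, 0, 0], 2).map(c => c.text));
```

The in-memory sort is O(n log n) over all chunks; pgvector's indexes (IVFFlat, HNSW) exist precisely to avoid that full scan at scale.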