
feat: add SmartRouter for unified LLM routing#453

Open
tarai-dl wants to merge 1 commit into arakoodev:ts from tarai-dl:fix/286-smart-router

Conversation

@tarai-dl

Summary

Adds a SmartRouter class that works like LiteLLM: pass any model name and it auto-routes the request to the correct provider.

Features:

  • Auto-detection: Model name prefix determines provider (gpt-* → OpenAI, gemini-* → Gemini, llama-* → Llama)
  • Fallback chain: If primary provider fails, tries next provider in priority order
  • Extensible: Register custom providers via config
  • Unified API: Single router.chat({ model, prompt }) call regardless of backend
  • Utilities: listProviders(), isModelSupported(), detectProvider()
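The registration and utility features above can be sketched as a small provider registry. This is an illustrative sketch, not the PR's actual internals; the `ProviderRegistry` and `ProviderConfig` names are hypothetical, and the real SmartRouter config shape may differ.

```typescript
// Hypothetical sketch: a registry mapping provider names to model-name
// prefixes and a handler, supporting custom registration plus the
// listProviders() / isModelSupported() utilities described above.
interface ProviderConfig {
    prefixes: string[];
    handler: (model: string, prompt: string) => Promise<string>;
}

class ProviderRegistry {
    private providers = new Map<string, ProviderConfig>();

    // Register a custom provider via config
    register(name: string, config: ProviderConfig): void {
        this.providers.set(name, config);
    }

    listProviders(): string[] {
        return [...this.providers.keys()];
    }

    // A model is supported if any registered provider claims its prefix
    isModelSupported(model: string): boolean {
        for (const { prefixes } of this.providers.values()) {
            if (prefixes.some((p) => model.startsWith(p))) return true;
        }
        return false;
    }
}
```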

Provider Patterns:

| Provider | Model Patterns |
| --- | --- |
| OpenAI | `gpt-*`, `o1-*`, `o3-*`, `chatgpt-*`, `text-embedding-*`, `dall-e-*` |
| Gemini | `gemini-*`, `palm-*`, `bison-*` |
| Llama | `llama-*`, `meta-llama/*`, `mixtral-*`, `mistral-*`, `qwen-*`, `deepseek-*` |
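The table above translates directly into prefix matching. A minimal sketch, assuming a simple first-match lookup (the `detectProvider` helper here mirrors the utility named in the PR, but its signature is an assumption):

```typescript
// Prefix tables mirroring the provider patterns in the table above.
const PROVIDER_PREFIXES: Record<string, string[]> = {
    openai: ["gpt-", "o1-", "o3-", "chatgpt-", "text-embedding-", "dall-e-"],
    gemini: ["gemini-", "palm-", "bison-"],
    llama: ["llama-", "meta-llama/", "mixtral-", "mistral-", "qwen-", "deepseek-"],
};

// Return the provider whose prefix list matches the model name,
// or null when no pattern matches.
function detectProvider(model: string): string | null {
    for (const [provider, prefixes] of Object.entries(PROVIDER_PREFIXES)) {
        if (prefixes.some((p) => model.startsWith(p))) return provider;
    }
    return null;
}
```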

Usage:

import { SmartRouter } from "@arakoodev/edgechains";

const router = new SmartRouter({
    openaiApiKey: process.env.OPENAI_API_KEY,
    geminiApiKey: process.env.GEMINI_API_KEY,
});

// Auto-routes to correct provider based on model name
const response = await router.chat({
    model: "gpt-4o",
    prompt: "Hello!",
});
// response = { content: "...", provider: "openai", model: "gpt-4o" }

// Fallback: the same call transparently retries the next configured provider
const response2 = await router.chat({
    model: "gpt-4o",
    prompt: "Hello!",
});
// If OpenAI errors, the router falls back to Gemini for this request
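The fallback behavior in the snippet above can be sketched as trying each provider in priority order and returning the first success. The `chatWithFallback` function and `ChatFn` type are illustrative names, not the PR's actual internals:

```typescript
// Hypothetical sketch of a fallback chain: call providers in priority
// order, remember the last error, and rethrow it only if all fail.
type ChatFn = (model: string, prompt: string) => Promise<string>;

async function chatWithFallback(
    providers: Array<{ name: string; chat: ChatFn }>,
    model: string,
    prompt: string
): Promise<{ content: string; provider: string }> {
    let lastError: unknown;
    for (const { name, chat } of providers) {
        try {
            return { content: await chat(model, prompt), provider: name };
        } catch (err) {
            lastError = err; // remember the failure and try the next provider
        }
    }
    throw lastError ?? new Error("no providers configured");
}
```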

Files Added:

  • JS/edgechains/arakoodev/src/ai/src/lib/smart-router/smartRouter.ts
  • JS/edgechains/arakoodev/src/ai/src/tests/smart-router.test.ts

Files Modified:

  • JS/edgechains/arakoodev/src/ai/src/index.ts — added SmartRouter export

Closes #286

/claim #286

- SmartRouter class auto-detects provider from model name
- Supports OpenAI (gpt-*, o1-*, o3-*), Gemini (gemini-*), Llama (llama-*, mixtral-*, qwen-*, deepseek-*)
- Provider fallback chain on failure
- Custom provider registration via config
- listProviders() and isModelSupported() utilities
- Comprehensive test suite

Fixes arakoodev#286
@github-actions

CLA Assistant Lite bot: Thank you for your submission, we really appreciate it. Before we can accept your contribution, we ask that you sign the Arakoo Contributor License Agreement. You can sign the CLA by adding a new comment to this pull request and pasting exactly the following text.


I have read the Arakoo CLA Document and I hereby sign the CLA


You can retrigger this bot by commenting recheck in this Pull Request

@tarai-dl
Author

I have read the Arakoo CLA Document and I hereby sign the CLA


Development

Successfully merging this pull request may close these issues.

BOUNTY: Convert the endpoints to a smart router like litellm does in python
