[Feature]: Hybrid Ollama Integration (Local & Cloud) #159

@jmaxdev

Description

Describe the feature

Core Features

  1. Hybrid Environment Support: Seamless detection and switching between Local Ollama instances and Cloud-hosted endpoints.
  2. Usage & Billing Logic: Implementation of a robust metering system to track consumption, essential for the subscription-based model.
  3. Tiered Model Infrastructure (Cloud):
     • Free Tier: Access to 1 lightweight model (e.g., Phi-3 or Llama 3 8B).
     • Premium Tier: Access to 2 high-performance models (e.g., Llama 3 70B or Mixtral).
  4. Cloud Quotas & Rate Limiting: Implementation of strict usage limits for Cloud users to ensure infrastructure stability and cost control.
  5. Identity Management: Secure User Authentication (Login/Register) and Profile Management.
  6. Localized Payment Gateway: Initial focus on the LATAM market via Mercado Pago integration, with PayPal scheduled for the global rollout phase.
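A minimal sketch of the hybrid detection in item 1: probe the local Ollama server (its default port is 11434, and `/api/tags` is a real, cheap endpoint to hit) and fall back to a cloud endpoint when it is unreachable. The cloud URL and the `select_endpoint` helper are illustrative assumptions, not part of this proposal's actual design.

```python
import urllib.request
import urllib.error

LOCAL_URL = "http://localhost:11434"            # Ollama's default local address
CLOUD_URL = "https://cloud.example.com/ollama"  # hypothetical cloud endpoint

def local_ollama_available(base_url: str = LOCAL_URL, timeout: float = 0.5) -> bool:
    """Probe the local Ollama /api/tags endpoint to see if a server is running."""
    try:
        with urllib.request.urlopen(f"{base_url}/api/tags", timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False

def select_endpoint(prefer_local: bool = True) -> str:
    """Return the local endpoint when reachable, otherwise fall back to cloud."""
    if prefer_local and local_ollama_available():
        return LOCAL_URL
    return CLOUD_URL
```

A short probe timeout keeps the switch "seamless": when no local server answers quickly, requests route to the cloud tier without blocking the user.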
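For items 2 and 4, one common way to combine metering with per-tier rate limits is a token bucket per user. The tier names and rates below are placeholders chosen for illustration; the real quotas would come from the billing design.

```python
import time

class TokenBucket:
    """Token-bucket rate limiter: a sketch of per-tier cloud quotas.
    Capacity bounds bursts; refill_per_sec sets the sustained rate."""

    def __init__(self, capacity: int, refill_per_sec: float):
        self.capacity = capacity
        self.tokens = float(capacity)
        self.refill_per_sec = refill_per_sec
        self.last = time.monotonic()

    def allow(self, cost: float = 1.0) -> bool:
        """Refill based on elapsed time, then spend `cost` tokens if available."""
        now = time.monotonic()
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.refill_per_sec)
        self.last = now
        if self.tokens >= cost:
            self.tokens -= cost
            return True
        return False

# Hypothetical limits: free tier ~10 requests/min, premium ~100 requests/min.
def bucket_for_tier(tier: str) -> TokenBucket:
    if tier == "premium":
        return TokenBucket(capacity=100, refill_per_sec=100 / 60)
    return TokenBucket(capacity=10, refill_per_sec=10 / 60)
```

Each `allow()` call doubles as a metering event, so the same hook can feed the subscription billing ledger.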

Target Use Cases

  1. Resource-Constrained Hardware: Users without high-end GPUs who require Cloud compute to run LLMs effectively.
  2. IDE-Centric Workflow: Users who prefer offloading AI processing to maintain local system performance for their IDE and development tools.

Would you like to implement this feature yourself by sending a PR?

Maybe
