Knowing Machine

The machine that knows the answers to all questions, using Interlaced Thinking (Kimi-k2) and a local knowledge base.

Features

  • Interlaced Thinking: Uses the Moonshot Kimi-k2-thinking model to reason about problems.
  • Local Knowledge Base: Integrates with a local LLM (e.g., via Ollama/LocalAI) to query private data.
  • Rich Terminal UI: Displays the thinking process and final answer beautifully in the terminal.
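The interlaced-thinking pattern above can be sketched in plain Python. Everything here is illustrative: `interlaced_think`, the `LOOKUP:` convention, and the stub functions are assumptions for the sketch, not the package's actual API — the real model and knowledge-base calls are stubbed out.

```python
from typing import Callable

def interlaced_think(query: str,
                     reason: Callable[[str], str],
                     kb_lookup: Callable[[str], str],
                     max_rounds: int = 3) -> str:
    """Alternate between a reasoning step and a knowledge-base lookup.

    `reason` stands in for the Kimi-k2-thinking model; `kb_lookup`
    stands in for the local LLM query. Both are toy stubs here.
    """
    context = query
    for _ in range(max_rounds):
        thought = reason(context)           # thinking step
        if "LOOKUP:" not in thought:        # no more facts needed
            return thought
        term = thought.split("LOOKUP:", 1)[1].strip()
        context += f"\n[kb] {kb_lookup(term)}"  # interlace the retrieved fact

    return context

# Toy stubs to exercise the loop:
def fake_reason(ctx: str) -> str:
    return "LOOKUP: secret" if "[kb]" not in ctx else "The secret is 42."

def fake_kb(term: str) -> str:
    return f"note about {term}: 42"

print(interlaced_think("What is the secret?", fake_reason, fake_kb))
```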

Installation

pip install .

Usage

Prerequisites

  1. Moonshot API Key: Set MOONSHOT_API_KEY environment variable.
  2. Local LLM: Ensure a local LLM is running (e.g., Ollama on port 11434).
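Both prerequisites can be checked up front. The sketch below is a hypothetical preflight helper, not part of the package; it assumes Ollama answers a plain GET on its root URL when running.

```python
import os
import urllib.error
import urllib.request

def preflight(base_url: str = "http://localhost:11434") -> list[str]:
    """Return a list of problems with the two prerequisites (empty if OK)."""
    problems = []
    if not os.environ.get("MOONSHOT_API_KEY"):
        problems.append("MOONSHOT_API_KEY is not set")
    try:
        # Ollama responds on its root endpoint when it is running.
        urllib.request.urlopen(base_url, timeout=2)
    except (urllib.error.URLError, OSError):
        problems.append(f"no local LLM reachable at {base_url}")
    return problems
```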

CLI

Run the machine from the command line:

python -m knowing_machine "What is the secret contained in my local notes?" --kb-model llama3

Python API

import asyncio

from knowing_machine import InterlacedThinker, KimiClient, LocalLMKnowledgeBaseTool

async def main():
    client = KimiClient()  # reads the MOONSHOT_API_KEY environment variable
    kb = LocalLMKnowledgeBaseTool(model="llama3")

    machine = InterlacedThinker(client, [kb])
    await machine.think("Your complex query here")

if __name__ == "__main__":
    asyncio.run(main())

Configuration

  • MOONSHOT_API_KEY: Required.
  • Local tool default base URL: http://localhost:11434/v1 (compatible with Ollama).
