
#259- Refactor: Externalize LLM system prompt to configuration file#260

Open
Cubix33 wants to merge 1 commit into fireform-core:main from Cubix33:decouple-prompt-and-code

Conversation


@Cubix33 Cubix33 commented Mar 16, 2026

Closes #259

📝 Description

Decouples the LLM system instructions from the core application logic. This allows users to easily customize AI behavior (e.g., enforcing date formats or adding local jargon) without having to edit the core Python source code.

🛠️ Changes Made

  • Extracted the hardcoded SYSTEM PROMPT string from src/llm.py.
  • Created a new src/prompt.txt file to store the base instructions.
  • Updated LLM.__init__() to load prompt.txt at runtime, resolving its path with os.path.
  • Updated build_prompt() to inject the loaded configuration into the Ollama payload.
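
The changes above can be sketched as follows. This is a minimal illustration, not the PR's actual diff: the class and method names (`LLM`, `build_prompt`) come from the description, but the constructor signature, attribute names, and the `"llama3"` model placeholder are assumptions.

```python
import os


class LLM:
    """Sketch of the refactor: system instructions live in prompt.txt, not in code."""

    def __init__(self, prompt_path=None):
        # Resolve prompt.txt relative to this module (via os.path, as the PR
        # describes) so the app can be launched from any working directory.
        if prompt_path is None:
            prompt_path = os.path.join(
                os.path.dirname(os.path.abspath(__file__)), "prompt.txt"
            )
        with open(prompt_path, "r", encoding="utf-8") as f:
            self.system_prompt = f.read().strip()

    def build_prompt(self, user_message):
        # Inject the externalized instructions into an Ollama-style payload.
        return {
            "model": "llama3",  # placeholder model name, not from the PR
            "system": self.system_prompt,
            "prompt": user_message,
        }
```

With this shape, editing prompt.txt changes `self.system_prompt` on the next startup, and no Python source needs to be touched.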

🧪 How to Test

  1. Open src/prompt.txt and append a test rule (e.g., "Always return dates in YYYY-MM-DD format").
  2. Run the application (e.g., make exec).
  3. Verify the LLM respects the new rule without requiring any modifications to llm.py.
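
Step 1 can be spot-checked in isolation before running the full app: append the rule, re-read the file the way the loader would, and confirm the rule landed. A small sketch (the base-instruction text is a stand-in; the path and example rule come from the steps above):

```python
import os

# Recreate the layout from the test steps with hypothetical base content,
# then append the example rule exactly as step 1 describes.
os.makedirs("src", exist_ok=True)
with open("src/prompt.txt", "w", encoding="utf-8") as f:
    f.write("You are a helpful assistant.\n")  # stand-in base instructions
with open("src/prompt.txt", "a", encoding="utf-8") as f:
    f.write("Always return dates in YYYY-MM-DD format\n")

# Re-read the file as the loader would and confirm the rule is present.
with open("src/prompt.txt", encoding="utf-8") as f:
    prompt = f.read()
assert "YYYY-MM-DD" in prompt, "test rule missing from prompt.txt"
print("rule appended without touching llm.py")
```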



Development

Successfully merging this pull request may close these issues.

[FEAT]: Externalize LLM System Prompt to a Text File
