YamlPrompt is a lightweight generative pipeline that enriches YAML files by replacing selected fields with AI‑generated text or images.
Prompts can be written directly in the YAML file or dynamically constructed using other YAML fields through a simple templating syntax.
Prompts can also be built from text generated from other prompts.
YamlPrompt turns static YAML into a dynamic content generator — ideal for localization workflows, narrative generation, prototyping, and procedural content creation (e.g., games, worldbuilding, scenario design).
Mark fields of your YAML input file with a prompt; each marked field will be replaced with:
- AI‑generated text
- or a generated image (the field receives the path to the generated image)
```yaml
product:
  name: "Anti-Monday Mug"
  description:
    prompt: >
      Write an attractive description of a white ceramic mug
      designed for use at work.
  image:
    prompt: >
      A white mug featuring a minimalist printed slogan: "Not Before Coffee".
    width: 512
    height: 512
```

This produces:

```yaml
product:
  name: Anti-Monday Mug
  # /product/description prompt_hash: 335f0a37a18f49ae6a35d329f5556b5f31a6e98605ba2e6aba49e65d45ede047
  description: "Introducing our sleek and sophisticated White Ceramic Mug, a perfect
    addition to any modern office space. This versatile mug is meticulously crafted
    for durability and style, making it an ideal companion for your daily work routine.
    The pristine white finish adds a touch of elegance to any workspace, while its
    robust ceramic construction ensures long-lasting performance."
  image: image_product_image.png
```

Prompts can reference other YAML fields using a simple templating syntax. References can be absolute or relative. Prompts can also reference other generated fields:
```yaml
book:
  title: "The Grimoire"
  description: >
    An ancient grimoire filled with arcane secrets and forbidden sorcery.
    Written during the Middle Ages by a powerful sorcerer in a dark fantasy world.
  pages:
    - page_text:
        prompt: >
          This page describes a detailed recipe for brewing a potion capable of poisoning a vampire.
          Write a full page for the grimoire. Use the book description: << /book/description >>
          as a background and style.
      title:
        prompt: >
          Provide the name of the potion described in the following text:
          << ../../page_text >>
```

YamlPrompt resolves all variables before sending the prompt to the models.
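Conceptually, resolving a `<< /path >>` reference amounts to walking the YAML tree and substituting the referenced value into the prompt. A minimal sketch of that idea, assuming absolute paths only (YamlPrompt's actual resolver also handles relative paths; the function name and regex here are illustrative, not the library's API):

```python
import re

def resolve_refs(prompt: str, doc: dict) -> str:
    """Replace << /a/b >> absolute references with values from a YAML tree.

    Hypothetical sketch: only absolute paths are handled here.
    """
    def lookup(match: re.Match) -> str:
        node = doc
        for key in match.group(1).strip("/").split("/"):
            node = node[key]  # descend one level per path segment
        return str(node)

    return re.sub(r"<<\s*(/[^>]+?)\s*>>", lookup, prompt)

doc = {"book": {"description": "An ancient grimoire."}}
print(resolve_refs("Style: << /book/description >>", doc))
# → Style: An ancient grimoire.
```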
YamlPrompt only regenerates a text or image when its prompt has changed. To ensure this, it stores a hash value in a comment attached to each generated field, as shown in the YAML output below:
```yaml
product:
  name: Anti-Monday Mug
  # /product/description prompt_hash: 335f0a37a18f49ae6a35d329f5556b5f31a6e98605ba2e6aba49e65d45ede047
  description: "Introducing our sleek and sophisticated White Ceramic Mug, a perfect
    addition to any modern office space. This versatile mug is meticulously crafted
    for durability and style, making it an ideal companion for your daily work routine."
  # /product/image prompt_hash: 0311b26e761c4ba2157454e7b171f9244d37b6a8f84f2cffb27162890c2fd202
  image: image_product_image.png
```

YamlPrompt uses Ollama/Mistral by default for text generation and Stable Diffusion for image generation. This behavior is fully extensible: additional model providers can be integrated through a plugin mechanism.
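The plugin interface itself is not documented here; purely as an illustration of the idea, a text-engine provider could be wrapped behind a single generation method. All names below are assumptions for the sketch, not YamlPrompt's actual plugin API:

```python
class TextEnginePlugin:
    """Hypothetical base class; the real YamlPrompt interface may differ."""

    def generate(self, prompt: str) -> str:
        raise NotImplementedError


class EchoEngine(TextEnginePlugin):
    """Toy provider that echoes the prompt, useful for testing pipelines offline."""

    def generate(self, prompt: str) -> str:
        return f"[generated from: {prompt}]"


engine = EchoEngine()
print(engine.generate("Describe a white mug."))
```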
YamlPrompt is built for workflows where YAML must remain human‑readable, stable, and predictable across generations.
By relying on the industry‑standard round‑trip parser, YamlPrompt preserves every structural and stylistic detail of your YAML:
- Key order
- Comments (inline, block, and end‑of‑line)
- Anchors & aliases
- Scalar styles and formatting
- Whitespace, indentation, and flow/block styles
This makes YamlPrompt safe for configuration files, CI/CD pipelines, IaC templates, and any environment where diffs must stay clean.
Templating is fully integrated, enabling expressive and maintainable YAML generation:
- File inclusion ({% include %})
- Context injection from CLI or Python
- Conditionals and loops
- Reusable partials and macros
You get the power of templating without sacrificing YAML round‑trip fidelity.
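For instance, a template might combine an include, injected context variables, and a loop. This fragment is illustrative only; the exact tag syntax and the variable names (`character_names`, `world_name`) are assumptions based on the Jinja-style tags listed above:

```yaml
{% include "header.yaml" %}
characters:
{% for name in character_names %}
  - name: {{ name }}
    bio:
      prompt: >
        Write a short biography for {{ name }} set in {{ world_name }}.
{% endfor %}
```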
Use YamlPrompt:
- as a Python library
- via CLI
- inside CI/CD pipelines
- in creative or generative workflows
Automatically generate localized text based on a YAML source file.
Create character descriptions, lore, quests, dialogues, and more.
Generate:
- environments
- NPC descriptions
- item descriptions
- contextual dialogs
- world descriptions
- concept images
```shell
pip install yamlprompt
```

You then need Ollama installed (it backs the default plugin). Follow the Ollama installation instructions here: https://docs.ollama.com/quickstart
```python
from yamlprompt import YamlPrompt
from yamlprompt.plugins import PluginManager

yp = YamlPrompt(image_engine="stablediffusion_image",
                text_engine="ollama_text/mistral")
output, ok = yp.generate("input.yaml", "output.yaml")
```

Fields recognized by the LLM generator:
```yaml
prompt: A user prompt text  # Prompt for the LLM
```

Fields recognized by the image generator:
```yaml
prompt: A user prompt text  # Prompt for the image generator
width: 512                  # Width of the image
height: 512                 # Height of the image
```

References to other fields can be relative or absolute:

```yaml
root:
  leaf1: This is leaf 1
  leaf2: leaf1 text is <../leaf1>    # Relative reference to leaf1
  leaf3: leaf1 text is </root/leaf1> # Absolute reference to leaf1
```
YamlPrompt can be used directly from the terminal to enrich a YAML file with AI‑generated text or images.
```shell
yamlprompt input.yaml output.yaml
```

This command:
- reads input.yaml
- reads output.yaml if it exists, to skip regenerating text or images from prompts that haven’t changed
- processes all AI‑annotated fields using the configured engines
- writes the final result to output.yaml
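The change detection that lets YamlPrompt skip unchanged prompts can be sketched as hashing each prompt and comparing against the stored hash. This is a simplified illustration; the exact hash input and helper names are assumptions, though the stored digests in the output above are SHA-256-sized:

```python
import hashlib
from typing import Optional

def prompt_hash(prompt: str) -> str:
    """SHA-256 hex digest of the prompt text."""
    return hashlib.sha256(prompt.encode("utf-8")).hexdigest()

def needs_regeneration(prompt: str, stored_hash: Optional[str]) -> bool:
    """Regenerate only when there is no stored hash or the prompt changed."""
    return stored_hash is None or prompt_hash(prompt) != stored_hash

h = prompt_hash("Write an attractive description of a mug.")
print(needs_regeneration("Write an attractive description of a mug.", h))
# → False (the prompt is unchanged, so generation is skipped)
```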
```shell
yamlprompt input.yaml output.yaml --debug
```

Shows unresolved variables and internal processing details.
```shell
yamlprompt input.yaml output.yaml --image_dir ./images
```

Stable Diffusion outputs will be stored in this directory.
```shell
yamlprompt input.yaml output.yaml --inline_prompts
```

Inlines each prompt text as a comment in the output file so you can inspect what was used to generate the text or images.
```shell
yamlprompt input.yaml output.yaml --text_engine=ollama_text/deepseek-r1
```

Selects the text engine. The default is ollama_text/mistral.
The default (and currently only) text plugin is Ollama, which can run any of the LLMs listed here: https://ollama.com/library
Follow the Ollama installation instructions here: https://docs.ollama.com/quickstart
```shell
yamlprompt input.yaml output.yaml --image_engine=stablediffusion_image
```

Selects the image engine. The default is stablediffusion_image.
```shell
yamlprompt examples/character.yaml results/character_out.yaml \
    --image_dir generated_images \
    --debug
```

Exit codes:
- 0 — generation completed successfully
- 1 — unresolved variables remain in the output YAML
Contributions are welcome! Feel free to open issues, submit PRs, or suggest new features.
MIT
