Blender's Python API can automate nearly everything the GUI can do. Its documentation is improving but still lacks practical examples. These recipes fix that.
Blender ships with a powerful Python API called bpy. In theory, you can script every operation the GUI offers: modeling, animation, rendering, compositing, physics simulations, and more. In practice, while the official documentation has improved in Blender 4.x with better descriptions and some examples, it still largely consists of auto-generated references that lack practical workflow examples, context, and guidance on common pitfalls.
This cookbook is a collection of practical, copy-paste-ready Python recipes that solve real Blender automation tasks. Every function includes type hints, docstrings, and can be tested outside Blender using mocked imports. Whether you are a technical artist automating a pipeline, a game developer batch-exporting assets, or a machine learning engineer generating synthetic training data, these recipes give you working code instead of hours of forum archaeology.
Open Blender's scripting workspace (or run Blender with `--python`), and paste:
```python
import bpy

bpy.ops.mesh.primitive_cube_add(size=2, location=(0, 0, 1))
obj = bpy.context.active_object
obj.name = "MyCube"
print(f"Created {obj.name} at {tuple(obj.location)}")
```

That is the core pattern: call an operator, grab the result from context, configure it. Every recipe in this cookbook follows this pattern and wraps it in a clean, reusable function.
Caveat: `bpy.context.active_object` is fragile -- it returns whatever object is currently active, which may not be the one the operator just created if another script or handler changed the selection between the operator call and the context read. In production pipelines, prefer accessing the created object through `bpy.data.objects` by name, or capture the selection immediately after the operator with no intervening calls.
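A sketch of that safer pattern (`add_cube_by_name` is an illustrative helper, not one of the recipes):

```python
def add_cube_by_name(name: str, size: float = 2.0,
                     location: tuple[float, float, float] = (0.0, 0.0, 1.0)):
    """Create a cube and return a stable reference to it by name.

    The deferred `import bpy` keeps this module importable (and
    testable) outside Blender.
    """
    import bpy

    bpy.ops.mesh.primitive_cube_add(size=size, location=location)
    # Capture the result immediately, with no intervening calls ...
    created = bpy.context.active_object
    created.name = name
    # ... then re-fetch by name so later code never depends on context.
    return bpy.data.objects[name]
```

Callers get a reference fetched from `bpy.data.objects`, so downstream code does not depend on the active-object state.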
The recipes are organized into five progressive chapters, each in its own directory under recipes/.
The foundation. Create primitive objects (cubes, spheres, cylinders, planes), apply materials with Principled BSDF or emission shaders, and perform transforms (move, rotate, scale, duplicate). If you are new to bpy scripting, start here.
Key recipes:
- `create_objects.py` -- spawn any primitive with one function call
- `materials.py` -- create diffuse and emission materials, apply them to objects
- `transforms.py` -- move, rotate, scale, and duplicate objects programmatically
Control the scene beyond individual objects. Set up cameras with tracking constraints, add sun lights, load HDRI environment maps, configure render resolution, organize objects into collections, and import/export files in OBJ, FBX, and glTF formats.
Key recipes:
- `scene_setup.py` -- camera placement, sun lights, HDRI loading, resolution
- `collections.py` -- create collections, move objects between them
- `import_export.py` -- OBJ, FBX, and GLB import/export wrappers
Keyframe animation made simple. Insert location and rotation keyframes at specific frames, create bounce animations, render single frames or full animation sequences, and switch between EEVEE and Cycles render engines.
Key recipes:
- `keyframes.py` -- location/rotation keyframes, bounce animation generator
- `render_animation.py` -- render frames, animations, engine setup
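The math behind a decaying bounce is plain trigonometry and can be sketched without `bpy` (the `bounce_keyframes` helper and its halve-per-bounce decay rule are illustrative assumptions, not the actual `keyframes.py` recipe):

```python
import math


def bounce_keyframes(frames: int, height: float,
                     bounces: int) -> list[tuple[int, float]]:
    """Compute (frame, z) pairs for a decaying bounce animation.

    Each bounce is one |sin| arch; the peak height halves on every
    successive bounce.
    """
    keys = []
    for f in range(frames + 1):
        t = f / frames * bounces * math.pi        # `bounces` half-periods of sine
        peak = height * 0.5 ** int(t / math.pi)   # halve the height per bounce
        keys.append((f, peak * abs(math.sin(t))))
    return keys
```

Inside Blender, each pair would feed `obj.location.z` followed by `obj.keyframe_insert(data_path="location", frame=f)`.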
Scale your workflows. Render multiple scene configurations in a loop, generate color variations of an object, build headless rendering commands for CI/CD pipelines, and create batch render scripts that run inside Blender without the GUI.
Key recipes:
- `batch_render.py` -- batch scene rendering, color variation generation
- `headless_render.py` -- CLI command builder, batch script generator
Generate synthetic datasets for machine learning. Create randomized scenes with scattered objects, randomize camera angles and lighting conditions, and render thousands of labeled images for training computer vision models.
Key recipes:
- `synthetic_dataset.py` -- full dataset generation pipeline with randomization
- `random_scene.py` -- random scene creation, object scattering
Batch rendering is one of the most valuable automation targets in Blender. Common use cases include:
- Game asset pipelines: Render turntable previews of hundreds of models overnight. Export each model to FBX/GLB with standardized settings, then render preview images for an asset catalog.
- Architectural visualization: Generate multiple lighting conditions (morning, noon, sunset, overcast) for the same scene without manual setup.
- Product visualization: Render a product in every available color, material finish, and camera angle for an e-commerce catalog.
- Animation studios: Distribute frame ranges across render farm nodes using headless rendering commands.
The batch_render.py and headless_render.py recipes provide the building blocks for all of these workflows.
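Distributing a frame range across render farm nodes is plain arithmetic; a minimal sketch (`split_frame_range` is a hypothetical helper, not part of the recipes):

```python
def split_frame_range(start: int, end: int,
                      nodes: int) -> list[tuple[int, int]]:
    """Split an inclusive frame range into contiguous chunks, one per node.

    Earlier nodes absorb the remainder, so chunk sizes differ by at
    most one frame.
    """
    total = end - start + 1
    base, extra = divmod(total, nodes)
    ranges = []
    frame = start
    for i in range(nodes):
        size = base + (1 if i < extra else 0)
        ranges.append((frame, frame + size - 1))
        frame += size
    return ranges
```

Each `(start, end)` pair then maps directly onto Blender's `-s`/`-e` frame-range flags for one node's headless render command.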
Blender can run entirely from the command line without opening a window. This is essential for server-side rendering, CI/CD pipelines, and render farms. The basic pattern is:
```shell
blender -b scene.blend --python script.py -o /output/frame_ -f 1
```

The `-b` flag enables background (headless) mode. The `headless_render.py` recipe wraps this into a Python function that builds the command for you, making it easy to integrate Blender rendering into larger automation scripts or Docker containers.
Training computer vision models requires massive labeled datasets. Blender can generate these synthetically: create 3D objects, randomize their appearance and the scene conditions, render images, and export bounding boxes or segmentation masks.
The synthetic_dataset.py recipe demonstrates the core loop:
- Clear the scene
- Create a random object from a set of primitives
- Randomize camera angle (spherical coordinates)
- Randomize lighting energy
- Render and save
By varying object types, materials, backgrounds, camera positions, and lighting, you can generate tens of thousands of diverse training images without any manual annotation. This approach is used in robotics (sim-to-real transfer), autonomous driving (synthetic traffic scenes), and retail (product recognition).
These recipes do not yet cover two major Blender 4.x features:
- Geometry Nodes via Python -- Blender 4.x greatly expanded Geometry Nodes, which can be created and wired programmatically through `bpy.data.node_groups` and the `GeometryNodeTree` type. Recipes for procedural geometry node setups are planned.
- EEVEE Next -- Blender 4.2 replaced the legacy EEVEE engine with EEVEE Next (`BLENDER_EEVEE_NEXT`). The render recipes in this cookbook use the updated engine identifier. If you are using Blender 3.x, change `BLENDER_EEVEE_NEXT` to `BLENDER_EEVEE`.
- Blender 4.x (4.0 or newer recommended; the current LTS is Blender 4.2)
- Python 3.11+ (bundled with Blender 4.x)
- For running tests outside Blender: `pip install -e ".[dev]"`
The recipes use Blender's bundled Python interpreter. You do not need to install `bpy` separately. The `TYPE_CHECKING` import pattern ensures all recipe files are importable and lintable in a standard Python environment, while the actual `import bpy` happens at runtime inside Blender.
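A minimal sketch of that pattern (the `rename_active_object` function is illustrative, not from the recipes):

```python
from __future__ import annotations

from typing import TYPE_CHECKING

if TYPE_CHECKING:
    import bpy  # visible to mypy and IDEs, never executed at runtime


def rename_active_object(context: bpy.types.Context, name: str) -> str:
    """Rename the active object and return the new name.

    With postponed annotation evaluation, `bpy.types.Context` stays an
    unevaluated string, so this module imports cleanly outside Blender.
    """
    context.active_object.name = name
    return context.active_object.name
```

Tests can then pass a lightweight stand-in object with an `active_object` attribute instead of a real Blender context.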
Tests mock the entire bpy module so you can validate recipe logic without a Blender installation:
```shell
pip install -e ".[dev]"
pytest tests/ -v
```

Linting and type checking:

```shell
ruff check recipes/ utils/ tests/
mypy recipes/ utils/ --ignore-missing-imports
```

```
blender-python-cookbook/
  recipes/
    basics/            -- primitives, materials, transforms
    scene_management/  -- camera, lights, HDRI, collections, import/export
    animation/         -- keyframes, render animation
    batch_processing/  -- batch render, headless CLI
    data_generation/   -- synthetic datasets, random scenes
  utils/
    blender_utils.py   -- shared helpers (clear scene, get object, set engine)
  tests/
    test_recipes.py    -- mocked bpy tests for core recipes
    test_headless.py   -- tests for headless rendering utilities
```
- Blender -- Blender source code
- fake-bpy-module -- Type stubs for bpy, useful for IDE autocompletion
- BlenderProc -- Procedural Blender pipeline for synthetic data generation
- bpy-build -- Build bpy as a standalone Python module
MIT License. See LICENSE for details.