Blender Command Line Rendering



Blender is a tremendously useful open source 3D modeling program, and it can be run on the HPC to create high-fidelity visualizations of data as well as general 3D models. One benefit of running Blender on the HPC is that you can transfer in a Blender file with your scene already configured, start the rendering process, and leave it running overnight or over several days depending on the size of the render. Command line rendering makes it possible to run Blender headless, meaning no GPU has to be allocated for your render. Another benefit of rendering on the supercomputer is that large single-frame renders can consume a significant amount of memory, which HPC environments have to spare.

OOD Remote Desktop Session

Start up Open OnDemand by going to , or read up about OOD within our documentation

Enter the details of your request. Here we are requesting a 5-hour, 16-core interactive session on Elgato. It is important to note that your own PI group must be entered instead of visteam 

When the request clears the allocation queue, you will be able to launch your interactive desktop

Download and Configure Blender

In this section we will download Blender, extract it, and then write a rendering Python script

At the remote desktop start screen, open a terminal

Navigate to an existing folder where you have space

Download Blender with this wget command from the Clarkson mirror

wget ""

This lets us download the Long Term Support version of Blender from their homepage

Also make sure to uncompress the tar.xz archive at the end using this command

tar -xf blender-2.93.8-linux-x64.tar.xz

Here is the code that we will put into the  file:
#get the blender python library
import bpy
#set the scene 
scene = bpy.context.scene
# set the output format as .tif
scene.render.image_settings.file_format = "TIFF"
# specify where the rendered image/s will go and what their names should be
scene.render.filepath = "./frames/render"
# set the engine to high performance CYCLES renderer
scene.render.engine = "CYCLES"
# set the resolution percentage down for testing, turn this up to 100 when it's worked once
scene.render.resolution_percentage = 25
# how many cpu threads we should create, this is a good default for Elgato, but should be higher on Puma and Ocelote
# set it to the number of CPU cores you have in your allocation
scene.render.threads = 15
scene.render.threads_mode = "FIXED"
# write a still frame render of the scene
bpy.ops.render.render(write_still=True)
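Rather than hard-coding scene.render.threads for one cluster, you can size the thread count from the allocation itself. Below is a minimal sketch in plain Python (no bpy required), assuming a Slurm-managed cluster where the SLURM_CPUS_ON_NODE environment variable may be set:

```python
import os

def allocated_cpu_count():
    """Number of CPU cores available to this process.

    Prefer Slurm's SLURM_CPUS_ON_NODE variable when it is set,
    and fall back to the Linux scheduler affinity mask otherwise.
    """
    slurm_cpus = os.environ.get("SLURM_CPUS_ON_NODE")
    if slurm_cpus:
        return int(slurm_cpus)
    return len(os.sched_getaffinity(0))

# In the render script you would then use:
#   scene.render.threads = allocated_cpu_count()
#   scene.render.threads_mode = "FIXED"
print(allocated_cpu_count())
```

This way the same script works on Elgato, Puma, and Ocelote without editing the thread count by hand.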

Running the Blender Scene Render from the Command Line

Once you have created/edited the  file using a command line editor like vi, use this command to start the headless render

This ensures that the Blender file you have configured runs in the background ( -b  ) and that the  script is executed as a Python ( -P  ) script 

blender-2.93.8-linux-x64/blender -b <blender file here> -P
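If you prefer to drive the same command from a wrapper script, you can build the argument list in Python before launching it. This is only a sketch; the .blend file and script names below are hypothetical placeholders, not names from this guide:

```python
import subprocess

def render_command(blend_file, python_script,
                   blender="blender-2.93.8-linux-x64/blender"):
    """Build the argv list for a background (-b) render driven by a -P script."""
    return [blender, "-b", blend_file, "-P", python_script]

# Hypothetical file names -- substitute your own
cmd = render_command("scene.blend", "render_settings.py")
print(" ".join(cmd))
# To actually launch the render:
#   subprocess.run(cmd, check=True)
```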

With any luck, your terminal will output that the process is initializing. This scene renders a landscape mesh of Flagstaff using a heightmap in TIFF format, which is why we see messages like TIFFFetchNormalTag ; these shouldn't appear on your screen unless you are doing a similar task.

This output shows the initialization of other Blender systems that may or may not be part of the rendering you are doing. Unused systems are unlikely to detract from rendering performance and appear to be listed simply for diagnostic purposes.

Once the initialization has completed, the individual tiles of the larger image will begin rendering. This is where massive multicore environments can really shine, because a rendering thread can be dispatched for each core, provided there is enough memory to support all of them running at the same time.

There is also a time estimate, which is usually an overestimate of the full duration of the rendering task.

If you want to make sure that all the CPU cores you have allocated are in use, run htop -u <username>  in a new terminal tab.

When the process completes, you will see Blender quit after saving an image to the folder you specified
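As a quick sanity check, you can confirm that the output landed where the script's filepath setting pointed. A small sketch, assuming the frames directory and render file prefix used in the script above:

```python
from pathlib import Path

def finished_renders(output_dir="frames"):
    """Return rendered TIFF files in the output directory, oldest first."""
    frames = Path(output_dir).glob("render*.tif*")
    return sorted(frames, key=lambda p: p.stat().st_mtime)

for frame in finished_renders():
    print(frame)
```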

Viewing Your Results

This is a view of the image produced by the workflow.