Tissue cartography using Blender
Summary
Tissue cartography is the art of projecting a three-dimensional surface, usually of a biological sample, onto a 2D plane. This is useful for visualizing and analyzing complex 3D microscopy data.
For years, the state-of-the-art approach to generate cartographic projections was the MATLAB-based ImSAnE toolbox (Heemskerk and Streichan 2015). More recently, however, a Blender-based tool known as Blender Tissue Cartography was released (Claussen et al. 2025) (code repository). This tool has extensive documentation and several in-depth tutorials, which I highly recommend to anyone interested in tissue cartography.
This tutorial is a simplified version of the Blender Tissue Cartography tutorials. It is targeted at beginners wanting to learn the basics of tissue cartography, focusing on a single approach. It shows how to generate cartographic projections from 3D lightsheet microscopy data using Fiji to inspect the data, ilastik to segment the tissues, and Blender with the Blender Tissue Cartography add-on to create the projections.
If you want to generate cartographic projections using the MATLAB-based ImSAnE toolbox (Heemskerk and Streichan 2015), please check the imsane-tutorial which explains how to set up and run the pipeline (Vellutini 2022).
Requirements
- Fiji (Schindelin et al. 2012)
- ilastik v1.4.1.post1 (Berg et al. 2019)
- Blender v4.2.9 (Blender Foundation 2002)
- Blender Tissue Cartography (Blender add-on) (Claussen et al. 2025)
- `Drosophila_CAAX-mCherry.tif` dataset from Blender Tissue Cartography (Claussen et al. 2025)
Setup
Download Blender Tissue Cartography
- Go to https://github.com/nikolas-claussen/blender-tissue-cartography.
- Press the green button named `Code` > `Download ZIP` to begin the download.
- Unzip the contents in your working directory.
- You should see a new directory named `blender-tissue-cartography`.
- Copy the file `Drosophila_CAAX-mCherry.tif` located at `blender-tissue-cartography/nbs/Tutorials/drosophila_example/` to your working directory.
Download Blender
- Go to https://download.blender.org/release/Blender4.2/.
- Download Blender 4.2.9 (direct links available for Linux, macOS, and Windows).
- Unzip the file into your working directory.
- You should see a new directory named `blender-4.2.9-linux-x64` (or similar for other systems).
Note that older or newer versions of Blender might not work. Use v4.2.9, which is known to work with this tutorial.
Install Blender Tissue Cartography
- Open the directory `blender-4.2.9-linux-x64`.
- Double-click the file `blender` to open the program.
- Go to `Edit` > `Preferences` > `Add-ons` > `Add-ons Settings` (down arrow) > `Install from Disk...`.
- Select the file `blender_tissue_cartography-1.0.0-linux_x64.zip` located in the directory `blender-tissue-cartography/blender_addon/`.
- Close Blender.
Download ilastik
- Go to https://www.ilastik.org/download.
- Download ilastik 1.4.1.post1 (direct links available for Linux, macOS, and Windows).
- Unzip the file into your working directory.
- You should see a new directory named `ilastik-1.4.1.post1-Linux`.
Download Fiji
- Go to https://fiji.sc.
- Choose `Distribution: Stable`, then click the big download button.
- Unzip the file into your working directory.
- You should see a new directory named `fiji-stable-linux64-jdk`.
Inspect dataset in Fiji
Before starting, let’s inspect the `Drosophila_CAAX-mCherry.tif` dataset in Fiji.
- Open the directory `fiji-stable-linux64-jdk/Fiji.app/` and double-click the `fiji-linux-x64` launcher.
- Drag and drop `Drosophila_CAAX-mCherry.tif` into the Fiji window to open it.
- Scroll through the Z slices of the stack.
- To get more information, activate the orthogonal views with `Image` > `Stacks` > `Orthogonal Views` (or `Ctrl+Shift+H`).
Explore the sample to understand its shape well. Try to figure out which side of the embryo is dorsal, which is ventral, and which sides are left and right. Also notice the characteristics of the tissues and of the background regions, and think about the potential issues we might encounter with this dataset.
- Once done, close the orthogonal views and stack (leave Fiji open).
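Conceptually, Fiji’s Orthogonal Views are just reslices of the same 3D array along different axes. A minimal numpy sketch with a synthetic stack (in practice the dataset would be loaded with a library such as tifffile, which is not part of this tutorial):

```python
import numpy as np

# Synthetic stand-in for the 3D stack, ordered (Z, Y, X); in practice,
# load Drosophila_CAAX-mCherry.tif with e.g. tifffile.imread() instead.
stack = np.random.default_rng(0).integers(0, 10000, size=(50, 128, 256))

z, y, x = 25, 64, 128           # position of the crosshair
xy_view = stack[z, :, :]        # the slice Fiji shows by default
xz_view = stack[:, y, :]        # top view in Orthogonal Views
yz_view = stack[:, :, x]        # side view in Orthogonal Views

print(xy_view.shape, xz_view.shape, yz_view.shape)
# The display range matters: raw data may look dark because the
# intensities occupy only a fraction of the full bit depth.
print(stack.min(), stack.max())
```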
Segment tissues in ilastik
The first step is to segment the stack in 3D to distinguish tissue from background. This is required to create a 3D mesh in Blender with the shape of the sample. To accomplish that, we will use ilastik.
- Open the directory `ilastik-1.4.1.post1-Linux`, right-click the file `run_ilastik.sh`, and select `Run as a Program` to open ilastik.
Create project
- Maximize the interface (we will need it).
- Under `Create New Project`, click on `Pixel Classification`.

The `Create Ilastik Project` window will open.

- Navigate to your working directory and click `Save`.

A file named `MyProject.ilp` will be created.
Input Data
The ilastik interface is ready to define our input data.
- Under the `Raw Data` tab, click on `Add New...` > `Add separate Image(s)...`.
- Then, select the file `Drosophila_CAAX-mCherry.tif`.
ilastik will open the dataset in three orthogonal views: XY (blue), XZ (green), YZ (red).
Note, however, that the images are too dark; let’s fix this.
- Right-click the dataset row and select `Edit properties...`.
- Change the value of `Normalize Display` to `True` and set the `Range` maximum value to `10000`.
- Press `OK`.
The contrast will be much better now.
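What `Normalize Display` does is purely cosmetic: it rescales the displayed intensities so that the chosen range maps to the full brightness scale, without altering the underlying data. A numpy sketch of this mapping (the function name is illustrative):

```python
import numpy as np

rng = np.random.default_rng(4)
# A 16-bit image whose signal occupies only the low end of the range,
# which is why it looks dark with a default 0-65535 display range.
img = rng.integers(0, 10000, size=(64, 64)).astype(np.uint16)

def normalize_display(image, vmin=0, vmax=10000):
    """Map [vmin, vmax] to [0, 1] for display; data on disk is unchanged."""
    clipped = np.clip(image.astype(float), vmin, vmax)
    return (clipped - vmin) / (vmax - vmin)

shown = normalize_display(img)
print(shown.min(), shown.max())   # the displayed values now span ~[0, 1]
```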
Feature Selection
Next we need to select the image features to take into account for the segmentation.
- On the left column, press `2. Feature Selection`; the interface will update.
- Then press `Select Features...` to open the `Features` window.
- Select all the features and press `OK`.
Training
- On the left column, click `3. Training`.
A new toolbox will open underneath showing two labels, Label 1 (yellow) and Label 2 (blue), buttons for paint or eraser modes, a size drop-down menu, and a Live Update button.
Now is a good time to get familiar with the basic ilastik commands.
- Scroll forward: go down through the slices of the orthogonal dimension.
- Scroll backward: go up through the slices of the orthogonal dimension.
- `Ctrl`+Scroll forward: zoom in.
- `Ctrl`+Scroll backward: zoom out.
- Middle-click and hold: drag the view around.
Learn how to zoom in/out, go through the slices, and drag the view. Once we can zoom in significantly and reach the top or bottom of a view by dragging, we are ready for painting.
Our goal is to paint tissues in yellow and background in blue. But to accomplish that, we only need a few strokes at the right regions of the image.
- Begin by zooming in at the top region of the XZ (green) view.
- Select Label 2 (blue), change the size to `7`, and paint a line right above the tissue.
- Now select Label 1 (yellow) and paint the tissue immediately below the blue line.
These two simple lines indicate to ilastik that the image features in this region very close to the tissue correspond to “background”, and that the image features of the tissue below correspond to “foreground”. Placing the two lines adjacent to each other also helps ilastik understand where the boundary is.
- Since we want all of the tissue to be segmented (and not just the surface), also paint a yellow line down to the center of the embryo.
This is enough to get started. Based on these simple strokes, ilastik will learn a model and apply it to the entire dataset.
- To start the training, press the `Live Update` button.
ilastik will overlay the current segmentation model over the image. We should see that most tissue regions are yellow and that the region around the embryo is mostly blue. The more vivid the color, the more confident the model is about that specific region.
Now start painting, with simple strokes, over areas that are wrong or pale. For example, the corners of the image are background and should be blue, while any area inside the sample should be yellow. Use different brush sizes, or the eraser, if needed. This sample also has giant, super-bright beads; they are not tissue, so we want them blue.
Note that the segmentation model and overlay colors update upon each stroke, so we can see if what we did improved or worsened the segmentation. If it got worse, we can always erase the annotation.
Be meticulous and pay special attention to the edges of the image. We do not want tissue (yellow) touching the border, because this will create a hole in the segmentation. The better the segmentation, the better our visualization and cartographic projection will be.
There are ways to fix segmentation issues after converting the segmentation to a mesh, but they will not be covered in this tutorial. So, for this image, it is important to take care of the tip at the very top of the sample, because it is touching the edge.
- Use a size `1` brush to place a couple of blue lines at the very top.
- Go through the slices in each of the orthogonal views to fix any leftover segmentation uncertainties.
The segmentation overlay should be showing clearly separated yellow and blue regions that match the embryo and background.
Prediction Export
We can now export the segmentation prediction.
- On the left column, click on `4. Prediction Export`.
- Under `Export Settings`, keep the `Source` value as `Probabilities`.
- Then, press `Choose Export Image Settings...` to open the `Image Export Options` window.
There are two options that we need to change.
- Under `Cutout Subregion`, uncheck the row `c` (for channels) and change the stop value to `1`.
Since this image has only one channel, changing this option avoids loading a duplicate channel into Blender.
- Under `Output File Info`, change the value of `Format` to `multipage tiff`.
This is required to be able to load the segmentation into Blender.
- Press `OK` to close the window.
- Then press the `Export All` button and wait…
When the prediction is done, a new file named `Drosophila_CAAX-mCherry_Probabilities.tiff` will appear in the working directory.
- Open this file in Fiji to see how it looks before our next step in the tutorial.
- Now close the file, and let’s start generating the actual projection.
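It also helps to know what the exported `Probabilities` file contains: for each voxel, the probability of each label. With two labels, the two channels sum to one, and thresholding the foreground channel at 0.5 yields a binary segmentation. A minimal numpy sketch with synthetic data (in practice you would read `Drosophila_CAAX-mCherry_Probabilities.tiff` instead):

```python
import numpy as np

# Synthetic stand-in for ilastik's two-label probability output:
# channel 0 = foreground (tissue), channel 1 = background.
rng = np.random.default_rng(1)
foreground = rng.random(size=(20, 32, 32))       # P(tissue) per voxel
probabilities = np.stack([foreground, 1.0 - foreground], axis=-1)

# Per-voxel probabilities over the two labels sum to 1.
assert np.allclose(probabilities.sum(axis=-1), 1.0)

# Thresholding the foreground channel gives a binary segmentation,
# which is what the mesh generation step ultimately operates on.
binary = probabilities[..., 0] > 0.5
print("tissue voxels:", binary.sum())
```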
Import data to Blender
We can now import the image stack and segmentation probabilities into Blender.
Open Blender
- Enter the directory `blender-4.2.9-linux-x64` and double-click the file `blender`.
Blender will open with a nice splash screen at the center.
- Click anywhere to close it.
Notice, at the top left, that we are in the `Layout` tab (important for later). There’s a gray cube at the center; we want to get rid of it.
- In the top right panel, under `Scene Collection` > `Collection`, right-click the `Cube` line and select `Delete`.
Great. Let’s now focus on the bottom right corner: it is busy, full of icons and menus. Don’t get overwhelmed; we only need to select and use one of the modes.
- If not yet selected, click on the `Scene` icon (the white triangle with two circles) to activate this panel.
- Then locate the tab named `Tissue Cartography` at the bottom.
- Scroll down and make the side panels wider to be able to read the options of `Tissue Cartography`.
This is the main interface of the Blender Tissue Cartography add-on. It is through here that we will control most of the steps of this pipeline.
Load sample
The first thing we need to do is to load the sample.
- Click on the folder icon in `Tissue Cartography` > `File Path`, navigate to the working directory, and select the file `Drosophila_CAAX-mCherry.tif`.
Tip: bookmark the directory for easy access in the future.
- Click `Accept`.
- Then press `Load .tiff file` to load it into Blender.
A new row named `Drosophila_CAAX-mCherry_BoundingBox` will appear in the top right panel under `Scene Collection` > `Collection`, and the bounding box of the image stack will become visible in the main window as orange lines.
Only a portion of the bounding box is visible, but we want to see the whole thing. The controls to navigate the 3D space are at the top right corner of the main window. We have XYZ handles (red, green, blue), a zoom tool (magnifier lens), and a move tool (open hand).
- Click, hold, and drag any of these to move around.
- Clicking on the X, Y, or Z will reorient the sample along these axes (very useful).
Take some time to practice and finish by placing the bounding box at the center of the main window as in the image below.
Load probabilities
Now let’s load the probabilities file.
- Click on the folder icon of `Tissue Cartography` > `Segmentation File Path`, navigate to the working directory, and select the file `Drosophila_CAAX-mCherry_Probabilities.tiff`.
- Click `Accept`.
The file name will appear in the field, but before loading we want to adjust one parameter. Blender will take the segmentation probabilities and convert them into a three-dimensional mesh. The raw segmentation is generally full of sharp angles, which will not look very nice when we map the image information onto the mesh for visualization. Therefore, it is usually a good idea to apply a degree of smoothing upon importing the segmentation.
We can control the smoothing in the small field below and to the right of `Segmentation File Path`, named `S... 0.00`. It should read `Smoothing (µm)`, but the panel is too narrow to show the full name.
- Click on it and set it to `1.0`.
- Now click on `Get mesh(es) from binary segmentation .tiff file(s)` to generate the mesh (it takes a second).
A gray mesh shaped like our sample will appear inside the bounding box in the main window. Also notice that a new row named `Drosophila_CAAX-mCherry_Probabilities_c0` appeared in the top right panel under `Scene Collection` > `Collection`.
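To build intuition for why the smoothing helps: a binary segmentation jumps abruptly between 0 and 1, so a surface extracted from it has voxel-sized steps. Blurring first and extracting the 0.5 level set instead shifts the boundary by sub-voxel amounts and rounds off the staircase. A toy 1D numpy sketch of the idea (the add-on works in 3D with a kernel sized in µm; this is an illustration, not its implementation):

```python
import numpy as np

# A binary "segmentation" profile along one axis: background, then tissue.
profile = np.array([0, 0, 0, 0, 1, 1, 1, 1], dtype=float)

# Smooth with a small moving-average kernel (stand-in for a Gaussian).
kernel = np.ones(3) / 3.0
smooth = np.convolve(profile, kernel, mode="same")

# The 0.5 crossing of the smoothed profile varies continuously with the
# data, so in 3D the extracted isosurface loses its voxel staircase.
print(np.round(smooth, 2))
```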
Congratulations! We have successfully generated a 3D mesh of the sample. That’s already a powerful visualization method. Celebrate by exploring the sample. Rotate all around, zoom in to see details, and check how good the mesh is. Are there any holes or other artifacts?
Apply shading
The mesh is nice, but it would be even better to see the actual image data overlaid on the mesh. We can accomplish that using the shading view and function of Blender.
- First, we need to activate the Shading workspace at the top right corner of the main window (an icon that looks like a pie chart).
Once clicked, the mesh will become almost white.
Before applying the shading, there’s one important parameter to set: `Vertex ... 0.00` or, in full, `Vertex Shader Normal Offset (µm)`.
When this parameter is 0, the image data located exactly at the limits of the segmentation is applied onto the mesh. Here, that is the surface of the sample, which does not carry much information; the fluorescent signal of the tissue lies a few microns deeper. Therefore, we can use the offset parameter to adjust the exact layer to be applied to the mesh as shading.
- In this case, a value of `5` works well.
Every time we want to apply or refresh the shading, we need to select the bounding box and probabilities entries in the `Scene Collection` > `Collection` top right panel.
- We can do so by clicking on one and `Ctrl+Click` on the other to select both.
- Finally, press `Initialize/refresh vertex shading` to apply the shading and wait…
After a few seconds, we should see cell membranes overlaid onto the mesh.
- Take the chance to explore the sample again, now with some biological information projected into 3D!
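Conceptually, vertex shading with a normal offset samples the image volume not at each mesh vertex itself but at the vertex displaced along its surface normal. A hypothetical nearest-neighbor version in numpy (variable and function names are illustrative, not the add-on’s API):

```python
import numpy as np

rng = np.random.default_rng(2)
volume = rng.random(size=(40, 40, 40))       # image stack (Z, Y, X)

# One vertex on the mesh surface plus its unit normal (toy values).
vertex = np.array([20.0, 20.0, 20.0])        # position in voxel units
normal = np.array([0.0, 0.0, 1.0])           # unit surface normal

def sample_with_offset(vol, pos, nrm, offset_um, um_per_voxel=1.0):
    """Sample the volume at pos + offset along the normal (nearest voxel)."""
    shifted = pos + nrm * (offset_um / um_per_voxel)
    idx = np.clip(np.round(shifted).astype(int), 0, np.array(vol.shape) - 1)
    return vol[tuple(idx)]

# Offset 0 samples the segmentation surface itself; a nonzero offset
# reads a layer a few microns away, where the membrane signal lives.
surface_value = sample_with_offset(volume, vertex, normal, 0.0)
deeper_value = sample_with_offset(volume, vertex, normal, 5.0)
print(surface_value, deeper_value)
```

Whether a positive offset moves inward or outward depends on the orientation of the mesh normals.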
Generate projection
We are ready to generate our first cartographic projection. This is done in the UV Editing workspace.
- Click on the `UV Editing` tab to activate it.
The workspace will be divided in two: on the left we will see the projected mesh, and on the right, the mesh of the original sample in 3D.
Set up UV Editing
Before starting, we need to make sure that the correct options are enabled.
- First, zoom out the right side window so that the entire sample is visible.
- Then, disable the bounding box entry in the top right panel under `Scene Collection` > `Collection` by `Ctrl+Click` on the square nodes symbol to the left of the bounding box entry, as shown below.
- Next, change the `Select Mode` from `Vertex` to `Face`.
In the right hand workspace, we should see the sample (not the bounding box) highlighted in light orange.
We might accidentally click somewhere and the mesh will turn black. If this happens, worry not!

- Simply press `A` or click on `Select` > `All` to select the whole mesh.
To finish the setup, we need to orient the sample properly for the projection; it matters!
- Click on the Y axis handle until the sample is oriented vertically with the narrower tip pointing upward.
We are ready to project the mesh.
Project mesh
While the right side shows the sample, the left side shows what the projected mesh looks like; it is initially empty.
- To make the first projection, go to `UV` > `Cylinder Projection` to project the mesh onto the curved wall of a cylinder.
A crazy, palisade-like wall will appear.
But don’t despair.
- Click on the tiny menu named `Cylinder Projection` that appeared at the bottom of the workspace.
The options for the cylinder projection will appear. There we can define the orientation of the axes and other options that change how the 3D mesh is transformed into a 2D surface. What we need for now is to contain the projection within the square bounding area.
- Activate the checkbox `Scale to Bounds`.
This will nicely limit the mesh to the projection area.
That’s it. We have our first projected mesh. How good is it? Ideally, the mesh should occupy the entire projection area. Our projected mesh has a couple of portions slightly bulging outside the area, and we have an empty vertical portion on the right side. This could be fixed with some editing. However, for now, it looks good enough for a first try.
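For intuition, a cylinder projection maps each 3D vertex to 2D using its angle around the cylinder axis and its height along it; `Scale to Bounds` then rescales both coordinates to fill the unit square. A minimal numpy sketch of this mapping on toy vertices (not the Blender implementation):

```python
import numpy as np

rng = np.random.default_rng(3)
# Toy mesh vertices (x, y, z), roughly tubular around the z axis.
theta = rng.uniform(0, 2 * np.pi, 200)
z = rng.uniform(-50, 50, 200)
verts = np.column_stack([np.cos(theta), np.sin(theta), z])

# Angle around the axis -> u, height along the axis -> v.
u = (np.arctan2(verts[:, 1], verts[:, 0]) + np.pi) / (2 * np.pi)
v = verts[:, 2]

# "Scale to Bounds": normalize each coordinate into [0, 1].
def scale_to_bounds(a):
    return (a - a.min()) / (a.max() - a.min())

uv = np.column_stack([scale_to_bounds(u), scale_to_bounds(v)])
print(uv.min(), uv.max())   # both coordinates now span [0, 1]
```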
Project data
The next step is to project the actual image data onto this projected mesh surface.
- Change back to the `Layout` workspace, select both `Drosophila_CAAX-mCherry_BoundingBox` and `Drosophila_CAAX-mCherry_Probabilities_c0` under `Scene Collection` > `Collection`, and change the option `Normal Offs...` (in full, `Normal Offsets (µm)`) to `5` to match the `Vertex Shader Normal Offset (µm)` option.

Note: `Normal Offsets (µm)` accepts a comma-separated list of values. We can put `0,1,2,3,4,5` to generate a projection with 6 slices representing onion layers deeper into the tissues.
- Then, click on `Create Projection`.
Wait… the interface might become unresponsive. If a dialog appears, click on `Wait` and wait. When done, the shading over the sample will blink, but it should look very similar to how it was before (if it doesn’t, it is a sign that something went wrong).
Check projection
- To visually inspect the projected data, change back to the `UV Editing` workspace.
- Zoom out on the left side window to see the entire projection.
- Then click anywhere outside the sample or projection to unselect the mesh.
The sample mesh will become visible in black, and the cartographic projection should appear on the left side window. The orientation of the sample will match that of the projection (if the sample is upside down when projecting the mesh, the projected mesh will also be upside down).
So, what happened here? We projected the sample mesh to 2D using the cylinder approach. Then the add-on Blender Tissue Cartography used this projected mesh to create a projection of the image data from the original stack. Quite nice!
Every time we create a new projection, the projected data is stored as an image which is available for Blender to display as an image “data-block”. We can see them by clicking on the picture icon in the top menu.
This first projection is named `Channel_0_Layer_0`. The next will be named `Channel_0_Layer_0.001`, `Channel_0_Layer_0.002`, and so on.
Save projection
The projection now exists in Blender, but we need to export it to file.
- For that, go back to the `Layout` workspace, select both `Drosophila_CAAX-mCherry_BoundingBox` and `Drosophila_CAAX-mCherry_Probabilities_c0` under `Scene Collection` > `Collection` again, then click on `Save Projection`.
A Blender File View window will open.
- Navigate to the working directory.
- In the file name field, put the name of the file with `_cylinder` appended to it, to read `Drosophila_CAAX-mCherry_cylinder.tif`.
- Then press `Save Projection`.
Three new files will appear in the working directory, with `BakedData`, `BakedNormals`, and `BakedPositions` suffixes appended to the dataset filename.
- `BakedData` shows the original image data projected on the surface.
- `BakedPositions` shows the original XYZ positions projected on the surface in RGB.
- `BakedNormals` shows the XYZ surface directions perpendicular to each point in RGB.
Open projection
Let’s open the files in Fiji for inspection.
- Drag and drop the files into Fiji.
Explore them in more detail and check with `Image` > `Color` > `Channels Tool...` (`Ctrl+Shift+Z`) how the individual channels look. These files provide important information to reconstruct the 3D information from the projected surface in downstream analyses.
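For example, `BakedPositions` lets us map a pixel of the 2D projection back to its 3D coordinate in the original stack: the three channels of that pixel store the X, Y, Z position of the corresponding surface point. A hypothetical lookup in numpy, using a synthetic positions image in place of the real `BakedPositions` file:

```python
import numpy as np

# Synthetic stand-in for the BakedPositions image: shape (H, W, 3),
# where the three channels store the X, Y, Z coordinate of the surface
# point that each projected pixel came from.
H, W = 64, 128
yy, xx = np.meshgrid(np.arange(H), np.arange(W), indexing="ij")
baked_positions = np.stack(
    [xx * 2.0, yy * 2.0, np.full((H, W), 30.0)], axis=-1
)

def pixel_to_3d(positions, row, col):
    """Return the original 3D (X, Y, Z) position of a projected pixel."""
    return positions[row, col]

# A cell of interest spotted at pixel (10, 20) in the projection:
xyz = pixel_to_3d(baked_positions, 10, 20)
print(xyz)   # its location in the original image stack
```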
Optimize projection
Our initial projection is satisfactory, but there are many ways to optimize it for our specific needs. One immediate thing to try is a `Sphere Projection` instead of the `Cylinder Projection`. They are quite similar, but I found the sphere projection to be more consistent and predictable, and it might work better for more spherical samples.
Another common use case is controlling where the mesh will unwrap, which can be important for downstream analyses. In our sample, for example, the projection put the dorsal side on the left (where a clump of germ cells is visible at the bottom) and the ventral side on the right. However, let’s say that for my analyses I need the dorsal side at the center of the projection.
To accomplish that we can manually mark a seam on our mesh to define the unwrapping position.
Mark seam on mesh
- Go to the `UV Editing` workspace and change the `Select Mode` to `Edge select`.
In this mode, clicking on the mesh selects an edge, and subsequently `Ctrl+Click`ing on another edge selects the shortest path between the two. Like this, we can quickly select a line along the sample to mark the unwrapping position. What we want is to trace a line through the ventral side of the sample, the region opposite to the germ cell clump.
- Using the handle buttons, reorient the sample sideways with the ventral side facing us.
We will start by selecting an edge on one of the poles.
- Turn the sample to show one of the poles and zoom in to see the edges clearly.
- Then click on one edge at the center of the pole to select it.
- Zoom out slightly, turn the sample, and `Ctrl+Click` on another edge further away from the pole. A yellow line will appear connecting the pole with the current edge.
- Continue `Ctrl+Click`ing on edges along the sample until reaching the opposite pole (don’t worry if the line isn’t perfectly straight).
- Now press `Ctrl+E` to open a menu with edge options and select `Mark Seam`.
The seam will be marked in red (below, it is mixed with the yellow line from the edge selection).
Project mesh with seam
- Now reorient the sample vertically again with the narrow tip up and change the `Select Mode` back to `Face select`.
- Press `A` to select the whole mesh (the faces will turn orange).
- Go to `UV` > `Cylinder Projection`.
- Then check `Preserve Seams` in the option box.
As we can see, the projection changed. It is squeezed at the center of the projection area due to the very long protruding mesh at the top left and bottom right regions. Let’s evaluate how good it is by projecting the data.
Project data with seam
- Go to the `Layout` workspace, select both the bounding box and probabilities entries under `Scene Collection`, and press `Create Projection`; then, wait…
- When done, go back to the `UV Editing` workspace and click anywhere outside the sample to deselect the mesh.
The new data projection should be visible on the left side.
- If not, click on the image data-block picture icon and select `Channel_0_Layer_0.001`.
We have successfully changed the position of the unwrapping using the seam. The clump of germ cells is now at the center of the projection.
This projection is OK, but the contents are squeezed and it is not occupying the full bounding area. We can improve this by editing the projection mesh.
Edit projected mesh
- Press `A` to select the entire mesh.
- Select the `Transform` tool on the left side menu.
- Hover the mouse over the right side edge to reveal the scale handle.
- Drag the right side to the right to extend the orange mesh until the edge of the projection area.
- Then, drag the other side to extend the mesh to the left side.
- Finally, make adjustments so that the projected mesh is covering most of the projection area as shown below.
Now we can use the `Pinch` or `Grab` tools to edit the mesh at the corners so that they are not clipped out. Or we can leave them as is (they will be clipped out of the projection).
- Use `Pinch` to drag finer portions of the mesh to the corners.
- Use `Grab` to drag larger portions of the mesh to the corners.
In the end, the mesh should more or less fill the projection area.
Check edited projection
To check the new projection with the edited mesh, we need to re-create the image data projection onto the edited mesh.
- Go to the `Layout` workspace, select both the bounding box and probabilities entries under `Scene Collection`, and press `Create Projection`; then, wait…
- When done, switch to the `UV Editing` workspace, deselect the mesh, and check the new projection (select the latest image data-block, likely `Channel_0_Layer_0.002`).
Despite the unevenness of the corners (next time we can improve our pinching and grabbing skills), the edited projection is better than the first one and the tissue is oriented the way we needed. Let’s save the projection to disk.
- Go to the `Layout` workspace, select both the bounding box and probabilities entries under `Scene Collection` > `Collection`, and click on `Save Projection`.
- Then navigate to the working directory and give the file a different suffix.
- Finally, open the newly generated files in Fiji to investigate.
The projection is ready for image analyses.
There are several other use cases that are not covered in the current version of this tutorial, for example how to generate a projection with more layers, or how to create and register projections for different timepoints. If you are interested, these use cases are described in the Blender Tissue Cartography paper and documentation (Claussen et al. 2025).
Citation
Vellutini, B. C. (2026). Tissue cartography using Blender. Zenodo. https://doi.org/10.5281/zenodo.18090965
License
This tutorial is available under a Creative Commons Attribution 4.0 International License.