Hello,<div><br></div><div>I have a 3D volume that has been segmented, and I represent the segmented region as a polygonal mesh (vtkPolyData). What I want to do is render the 3D volume (orthoslice view, using vtkImagePlaneWidget) and then superimpose the corresponding segmentation mesh, so that I can visually inspect the segmentation results.</div>
<div><br></div><div>My pipeline so far is:</div><div><br></div><div>1. Visualization of the 3D volume</div><div>vtkMetaImageReader -> vtkImagePlaneWidget (for axial, coronal and sagittal slices) -> Rendering</div><div>
2. Visualization of the mesh</div><div>vtkXMLPolyDataReader -> Rendering</div><div><br></div><div>The pipeline works in the sense that I'm able to see in the same render window both the 3D slices and the mesh, as in this figure <a href="http://tinypic.com/view.php?pic=34pzv9w&s=6" target="_blank">http://tinypic.com/view.php?pic=34pzv9w&s=6</a>.</div>
<div><br></div><div>The problem is that the rendered mesh and the volume DO NOT share the same coordinate system, so the rendered polydata DOES NOT lie inside the 3D volume. What could be the problem in this case?</div>
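<div><br></div><div>One quick check is to compare the volume's physical (world) bounds, computed from its origin, spacing and extent (via <code>reader.GetOutput().GetOrigin()</code>, <code>GetSpacing()</code>, <code>GetExtent()</code>), against the mesh bounds (<code>poly_reader.GetOutput().GetBounds()</code>). The helpers below are my own plain-Python sketch of that comparison, not VTK functions:</div>

```python
def volume_bounds(origin, spacing, extent):
    """World-coordinate bounds of an image as
    (xmin, xmax, ymin, ymax, zmin, zmax), from origin/spacing/extent."""
    b = []
    for i in range(3):
        lo, hi = extent[2 * i], extent[2 * i + 1]
        b.append(origin[i] + spacing[i] * lo)
        b.append(origin[i] + spacing[i] * hi)
    return tuple(b)

def mesh_inside(mesh_bounds, vol_bounds):
    """True if the mesh's bounding box lies within the volume's bounds."""
    return all(
        vol_bounds[2 * i] <= mesh_bounds[2 * i]
        and mesh_bounds[2 * i + 1] <= vol_bounds[2 * i + 1]
        for i in range(3)
    )
```

<div>If this reports the mesh outside the volume, a common cause (though not the only one) is that the mesh is in index/voxel coordinates while the plane widgets display world coordinates, i.e. the image origin and spacing were not applied to the mesh; in that case transforming the polydata (e.g. with vtkTransformPolyDataFilter) into the image's world frame would be the usual fix.</div>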
<div><br></div><div>Thanks for your help,</div><div>Miguel</div>