A Haptic VR Milling Surgery Simulator – Using High-Resolution CT-Data

Magnus ERIKSSON a, Mark DIXON b and Jan WIKANDER a
a The Mechatronics Lab/Machine Design, KTH, Stockholm, Sweden
b SenseGraphics AB, Stockholm, Sweden

Abstract. A haptic virtual reality milling simulator using high-resolution volumetric data is presented in this paper. We discuss the graphical rendering, performed from an iso-surface generated by marching cubes with a hierarchical storage method optimized for fast dynamic changes to the data during the milling process. We also present a stable proxy-based haptic algorithm that maintains a tip position on the surface and avoids haptic fall-through.

Introduction

The work presented here describes a surgical training simulator for temporal bone milling that has been developed by the authors. The system will be used to educate and train surgeons for milling operations, e.g. complicated temporal bone operations such as the removal of brain tumors. In such an operation the surgeon mills a path through the skull with a small hand-held mill so that the tumor can be reached easily and with minimal damage to surrounding tissue. The milling phase of an operation of this type is difficult, critical and very time consuming, and reducing operation time by only a few percent has the potential for large savings in health costs.

Haptic and virtual reality simulation of the bone milling process is a new and largely unexplored research topic, although some research groups are addressing the problem [1], [2], [3]. All of these groups are at an early stage in their research, the existing solutions have shortcomings, and much more development is needed in this field to find adequate solutions. Our simulator differs from those mentioned above in that we use high-resolution data sets, which give a very realistic 3D visualization of the milling process. Our haptic rendering algorithm also improves on previous work, giving greater stability and reducing fall-through issues. Our approach handles the challenging problem of rendering dynamic volume data in real time and fulfills the given requirements for haptic and VR applications.

1. Graphical rendering

Skull bone is represented using data acquired from a CT scan, which offers high-quality imaging of bone structures. Our simulator supports importing patient-specific DICOM format volume data, which makes it possible to use the simulator for surgeons wanting to practice a particular patient-specific operation.

There are several different methods for graphical rendering of volumetric data. The most common are ray casting [4], 3D texture mapping [5] and marching cubes [6]. As in [7], our simulator uses the marching cubes algorithm because it meets our requirements for image quality and speed. The simulator performs stereographic rendering of the surface with a minimum update rate of 30 Hz and a latency of less than 300 ms [8] for dynamic updates to volume data from the milling process. An optimized variation of the marching cubes algorithm has been implemented that allows real-time updating of a restricted area of the volume data, using the following method:

1. Read in volume data and generate voxel gradients based on density. Create an octree structure with a user-specified tree depth, applying marching cubes to each leaf node in the octree.
2. Check for milling, updating volume density and gradient data if necessary.
3. Apply a localized marching cubes algorithm on any modified octree leaf nodes.
4. Render octree leaf nodes using OpenGL display lists for optimized rendering of unchanged nodes.
5. Repeat from 2.

These steps are described in detail below.

1.1. Read in and store the volumetric data

For volumetric data, an octree-based structure [9] is well suited for visualization of the bone milling process. The octree uses a hierarchical representation of the data to efficiently detect and update localized changes. Each node in the octree represents a fixed axis-aligned sub-region of the data. Various optimizations can be implemented that take advantage of this hierarchical structure.
Maintaining minimum and maximum density information for each node allows our marching cubes algorithm to trivially reject any node in the octree whose density values are all either higher or lower than the iso-surface density value [10].

OpenGL display list caching has also been optimized by taking advantage of the octree structure. Each node in the octree has a display list cache that caches the rendering of that node. Leaf node display lists cache triangle rendering of a sub-region of the iso-surface; non-leaf nodes use display lists to cache sub-regions of the octree. In this approach, any change to a single leaf node breaks only the caches of that leaf and all of its ancestors, minimizing the number of display list caches that need to be regenerated.

Octree depth is a trade-off between having too many and too few voxels in each leaf node. If there are too few voxels in each leaf node, the overhead of the octree outweighs the benefits; too many voxels in each leaf node means that localized updates must cover an unnecessarily large area and are thus inefficient. Experiments show that an octree depth of three or four is optimal, depending on the size of the volume data.
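The min/max rejection test can be sketched as follows. This is an illustrative Python sketch, not the authors' implementation; the names `OctreeLeaf` and `needs_marching_cubes` are assumptions.

```python
# Sketch: an octree leaf that caches min/max voxel density so marching cubes
# can trivially reject nodes that the iso-surface cannot cross.

class OctreeLeaf:
    def __init__(self, densities):
        # densities: flat list of voxel density values in this leaf's sub-region.
        self.densities = densities
        # Cached bounds, recomputed whenever the milling tool modifies a voxel.
        self.min_density = min(densities)
        self.max_density = max(densities)

    def needs_marching_cubes(self, iso_value):
        # The iso-surface passes through this leaf only if the iso-value
        # lies between the cached minimum and maximum densities.
        return self.min_density <= iso_value <= self.max_density
```

A leaf whose densities are all below (or all above) the iso-value is skipped entirely, so only leaves actually intersected by the surface pay the cost of triangulation.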

1.2. Check for milling

The haptic device has two buttons, one for removal of material and one for adding material. When either button is pressed, a milling algorithm for updating voxel density is applied and a localized update of the marching cubes algorithm is performed. The tip of the mill is represented as a sphere with radius r, and an axis-aligned bounding box is used to quickly find the voxels located inside the sphere, as shown in figure 1.

Figure 1. 2D representation of the axis-aligned bounding box region.
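The bounding-box check illustrated in figure 1 can be sketched as follows; this is a minimal illustrative version, not the authors' code, and the names `mill`, `volume` and `rate` are assumptions.

```python
# Sketch of the milling update: voxels inside the tool's axis-aligned bounding
# box are tested against the spherical tip, and density is reduced as a
# function of time and distance from the tool surface.

import math

def mill(volume, tip_center, r, dt, rate=1.0):
    """volume: dict mapping integer (x, y, z) voxel coordinates to density."""
    cx, cy, cz = tip_center
    # Axis-aligned bounding box enclosing the spherical tip of radius r.
    for x in range(math.floor(cx - r), math.ceil(cx + r) + 1):
        for y in range(math.floor(cy - r), math.ceil(cy + r) + 1):
            for z in range(math.floor(cz - r), math.ceil(cz + r) + 1):
                if (x, y, z) not in volume:
                    continue
                d = math.dist((x, y, z), tip_center)
                if d < r:  # voxel lies inside the spherical tip
                    # Voxels deeper inside the tip (small d) erode faster.
                    volume[(x, y, z)] = max(
                        0.0, volume[(x, y, z)] - rate * dt * (r - d) / r)
```

Adding material would be the symmetric operation, increasing density instead of decreasing it.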

Only voxels that fall inside the axis-aligned bounding box are checked for collision with the tool. Voxels located inside the bounding box with a distance, d, less than the radius, r, are affected by the milling tip. Adding or removing material is simulated by increasing or decreasing voxel density as a function of time and distance from the tool surface. Changes in voxel density result in an update of the voxel gradients used to determine surface normals for the iso-surface rendering. Updating the density of any voxel in a particular leaf node thus breaks the display list caching for that leaf node and its ancestors.

1.3. Apply the marching cubes algorithm on the updated tree nodes

After checking for changes to voxel density as a result of haptic milling, the graphical rendering algorithm checks for any octree leaf nodes that need regenerating. If an octree leaf node is out of date, the current iso-surface vertex and normal data is deleted and the marching cubes algorithm is applied locally to compute new vertex and normal geometry.

1.4. Iso-surface rendering

The iso-surface is rendered by performing a depth-first traversal of the octree structure, checking for invalid display list caches. On the first rendering traversal, all leaf nodes are rendered and a display list cache is generated for each of them. The subsequent n−1 graphic updates (where n is the tree depth) of unchanged data result in display list caches being generated for the parents of leaf nodes, grandparents, etc., until the root node is cached in a display list. Changes to voxel density as a result of haptic milling break the cache of one or more leaf nodes, leaving one or more branches of the octree with invalid caches, which are rebuilt over the next n graphic updates.

Figure 2 shows three objects that have been graphically rendered using the optimized octree approach described above. The tooth is a CT scan in DICOM format with a resolution of 256 x 256 x 176, the skull is also a DICOM CT data file with a resolution of 512 x 512 x 176, and the sphere is a dynamically generated data set with a resolution of 256 x 256 x 256.

Figure 2. Graphical rendering of a tooth, a skull and a free-form object.
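The cache invalidation rule described in section 1.1, where editing a leaf breaks only that leaf's cache and those of its ancestors, can be sketched as follows (illustrative names, not the authors' code; `cache_valid` stands in for a cached OpenGL display list):

```python
# Sketch: invalidating one leaf walks up to the root, so sibling branches of
# the octree keep their cached rendering untouched.

class Node:
    def __init__(self, parent=None):
        self.parent = parent
        self.cache_valid = True  # stands in for a cached display list

    def invalidate(self):
        # Break the caches of this node and all of its ancestors; stop early
        # if an ancestor is already invalid (its parents are invalid too).
        node = self
        while node is not None and node.cache_valid:
            node.cache_valid = False
            node = node.parent
```

Because only one root-to-leaf path is invalidated per edit, the number of display lists regenerated per frame stays proportional to the tree depth rather than the size of the volume.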

2. Haptic rendering

The haptic rendering algorithm presented here is a proxy-based rendering technique [11] that renders an arbitrary density value as a haptic surface. For high-quality haptic rendering, the haptic process needs to run in a separate thread from the graphical updates, at an update rate of 1000 Hz. The haptic algorithm is as follows:

1. Calculate the density value at the position of the proxy using trilinear interpolation.
2. If the density value ≥ the iso-value ⇒ collision, update the proxy position. Else ⇒ no collision and the proxy position remains the position of the probe.
3. Calculate the force = k·(proxy_position − probe_position).
4. If milling ⇒ add a vibration force.
5. Return the total force to the haptic device.

The haptic rendering algorithm maintains a proxy at a position where voxel density is less than the density value used for iso-surface generation. If the density value at the position of the haptic device is lower than the density value of the iso-surface, then we update the proxy position to be that of the haptic device. Otherwise we update the proxy to minimize the distance between the haptic device and the proxy, whilst maintaining the requirement that the proxy remain at a position with a lower density than that of the iso-surface. This is performed using a variant of the algorithms described in [2, 12].

The general approach is to update the proxy position with a two-step movement. First we move the proxy in a direction tangential to the surface. We then compute the voxel gradient at this new location to obtain a normal vector, and finally the proxy is moved along this normal vector towards the surface. The tangential movement is computed with the following variables:

1. The normalized gradient, n̂1, at p_proxy, computed using trilinear interpolation.
2. The distance vector a = p_probe − p_proxy between the haptic device and the proxy.
3. The projection of a on n̂1: a1 = a · n̂1.
4. The tangential direction: n2 = a − a1·n̂1, with magnitude a2 = |n2| and unit vector n̂2.
5. Friction μ·a1 ⇒ magnitude of the proxy movement: a3 = max(a2 − μ·a1, 0).
6. The new proxy position: p_proxy_tng = p_proxy + a3·n̂2.


Figure 3. The algorithm for the tangential movement of the proxy.
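The tangential movement of figure 3 (steps 1 to 6) can be sketched as follows, assuming a `gradient(p)` field that returns the density gradient at p; all helper names are illustrative, and vectors are plain 3-tuples.

```python
# Sketch of the friction-limited tangential proxy step (not the authors' code).

import math

def sub(u, v): return tuple(a - b for a, b in zip(u, v))
def add(u, v): return tuple(a + b for a, b in zip(u, v))
def dot(u, v): return sum(a * b for a, b in zip(u, v))
def scale(u, s): return tuple(a * s for a in u)

def normalize(u):
    n = math.sqrt(dot(u, u))
    return scale(u, 1.0 / n) if n > 0 else u

def tangential_step(p_proxy, p_probe, gradient, mu=0.3):
    n1 = normalize(gradient(p_proxy))       # step 1: normalized gradient
    a = sub(p_probe, p_proxy)               # step 2: proxy-to-device vector
    a1 = dot(a, n1)                         # step 3: projection on n1
    n2 = sub(a, scale(n1, a1))              # step 4: tangential component
    a2 = math.sqrt(dot(n2, n2))
    n2_hat = normalize(n2)
    a3 = max(a2 - mu * a1, 0.0)             # step 5: friction-limited magnitude
    return add(p_proxy, scale(n2_hat, a3))  # step 6: new tangential position
```

With μ = 0 the proxy slides freely along the surface toward the probe; larger μ shortens the tangential step, so the proxy sticks when the lateral pull is small compared to the normal load.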

When the new tangential proxy position has been found, the intersection point with the surface is derived from the following computations:

7. The normalized gradient, n̂3, and the density value v_proxy_tng at p_proxy_tng.
8. If v_proxy_tng ≥ iso-value ⇒ inside the object ⇒ set the step direction d̂ = n̂3; else ⇒ outside the object ⇒ set d̂ = −n̂3.
9. New proxy position p_proxy_new = p_proxy_tng + s, where s = step_size·d̂.
10. The density value v_proxy_new at p_proxy_new.

Steps 9 and 10 are performed iteratively until either a point outside (or inside) the surface is found or a maximum number of iterations is reached. Linear interpolation between the last two points used in step 9 gives an approximation to a point that intersects the surface, p_proxy_intersection. By computing the gradient at p_proxy_intersection we can finally move the proxy away from the surface by the radius of the proxy, to ensure that the proxy is located entirely outside of the surface.

The haptic force is computed using a spring function between the haptic device and the proxy, F = k·(p_proxy_surface − p_probe). If the user has activated the milling mode, we add a small random variation to the final force to simulate the vibration of the drill.
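The iterative surface search of steps 7 to 10 and the spring force can be sketched as follows. This is an illustrative version: it assumes scalar fields `density(p)` and `gradient(p)`, where the gradient is oriented out of the material (so that stepping along d̂ as in step 8 moves toward the surface), and all names and default parameters are assumptions.

```python
# Sketch: march from the tangential proxy position toward the iso-surface,
# then linearly interpolate between the last two samples to approximate the
# intersection point.

import math

def find_surface(p_start, density, gradient, iso, step_size=0.05, max_iter=50):
    g = gradient(p_start)
    norm = math.sqrt(sum(c * c for c in g))
    n3 = tuple(c / norm for c in g)                 # step 7: normalized gradient
    inside = density(p_start) >= iso
    # Step 8: step along the outward normal if inside, against it if outside.
    d = n3 if inside else tuple(-c for c in n3)
    p_prev, v_prev = p_start, density(p_start)
    for _ in range(max_iter):
        p_new = tuple(pc + step_size * dc
                      for pc, dc in zip(p_prev, d))  # step 9
        v_new = density(p_new)                       # step 10
        if (v_new >= iso) != inside:                 # iso-surface crossed
            # Linear interpolation between the last two sample points.
            t = (iso - v_prev) / (v_new - v_prev)
            return tuple(pp + t * (pn - pp)
                         for pp, pn in zip(p_prev, p_new))
        p_prev, v_prev = p_new, v_new
    return p_prev  # give up after max_iter steps

def spring_force(p_proxy_surface, p_probe, k=200.0):
    # F = k * (p_proxy_surface - p_probe)
    return tuple(k * (a - b) for a, b in zip(p_proxy_surface, p_probe))
```

In the full algorithm the returned intersection point would additionally be offset along the local gradient by the proxy radius before the spring force is evaluated.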

3. Equipment and implementation

Our application uses the SenseGraphics H3D API to manage graphical and haptic rendering and the synchronization between the two processes. Haptic rendering is performed using a PHANToM Omni haptic device. The workspace of the Omni is sufficient to realistically mimic a real surgery situation. One limitation of this device is that it cannot render torque forces. Another limitation is its poor stiffness, which is very important for a realistic feeling when interacting with stiff materials such as bone. The application is designed to be run on a SenseGraphics 3D Immersive Workbench, a co-located hapto-visual display system that incorporates stereographic image rendering. This places an extra burden on the graphical rendering, since objects must be rendered twice to form a stereo pair.

4. Conclusion and future work

Our simulator can import patient-specific DICOM data from a high-resolution CT or MRI scan to be used for both graphical and haptic rendering. The graphical rendering is performed from an iso-surface generated by an optimized marching cubes algorithm that uses a hierarchical storage method to optimize for dynamic changes to the data. A proxy-based haptic rendering method is used to maintain a tip position on the surface and to avoid fall-through problems. The density values of the voxels inside the sphere are reduced during milling to simulate the removal of bone material. The simulator can also be used in other areas, such as dental simulation, simulation of craniofacial surgery and free-form design/sculpting of high-resolution volumetric data sets.

The skull shown in this paper has a resolution of 512 x 512 x 174, generating 2,420,458 triangles rendered at a frame rate of 30 Hz. The haptic rendering loop is updated at 1000 Hz. The simulation has been tested on a PC with a Pentium 4 3.2 GHz processor and a Quadro FX1400 graphics card. Additional features of our application include the use of cutting planes, and zoom and rotation to explore interesting regions of the data. Milling sound effects are also simulated to give a more realistic feeling to the milling process.

Future work will look at improvements to the haptic rendering algorithm and at stability issues that occur where two stiff materials intersect. We will also perform more tests to establish whether the PHANToM Omni device can generate sufficient forces for simulation of the milling process. A simple particle simulation is also planned, to visualize dust generated during material removal for more realistic visual rendering. The performance of the simulator will be tested and validated by surgeons.

References

[1] Agus M. et al., Real-time Haptic and Visual Simulation of Bone Dissection, IEEE Virtual Reality Conference, IEEE Computer Society Press, 2002, pp. 209-216.
[2] Pflesser B., Volume Cutting for Virtual Petrous Bone Surgery, Computer Aided Surgery, 7, 2002, pp. 74-83.
[3] Sewell C., Morris D. et al., Quantifying Risky Behavior in Surgical Simulation, Medicine Meets Virtual Reality Conference, January 27-29 2005, pp. 451-457.
[4] Levoy M., Display of Surfaces from Volume Data, IEEE Computer Graphics and Applications, May 1988, pp. 29-37.
[5] Cabral B., Cam N., Foran J., Accelerated Volume Rendering and Tomographic Reconstruction Using Texture Mapping Hardware, 1994 Symposium on Volume Visualization, 1994, pp. 91-98.
[6] Lorensen W.E., Cline H.E., Marching Cubes: A High Resolution 3D Surface Construction Algorithm, Computer Graphics, 21(4), July 1987, pp. 163-169.
[7] Peng X., Chi X., Ochoa J.A., Leu M.C., Bone Surgery Simulation with Virtual Reality, ASME DETC2003/CIE, Chicago USA, September 2-6 2003.
[8] Mark W.R., Randolph S.C., Finch M., Verth J., Adding Force Feedback to Graphics Systems: Issues and Solutions, 23rd Conference on Computer Graphics, 1996, pp. 447-452.
[9] Foley J. et al., Computer Graphics: Principles and Practice, 2nd edition, 1996.
[10] Wilhelms J., Van Gelder A., Octrees for Faster Isosurface Generation, ACM Transactions on Graphics, 11(3), 1992, pp. 201-227.
[11] Zilles C.B., Salisbury J.K., A Constraint-based God-object Method for Haptic Display, Proceedings of the 1995 IEEE Conference on Intelligent Robots and Systems, Vol. 3, 1995, pp. 146-151.
[12] Vidholm E., Agmund J., Fast Surface Rendering for Interactive Medical Image Segmentation with Haptic Feedback, SIGRAD 2004.